[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
18662 1726867305.08427: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Isn
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
18662 1726867305.08705: Added group all to inventory
18662 1726867305.08707: Added group ungrouped to inventory
18662 1726867305.08709: Group all now contains ungrouped
18662 1726867305.08711: Examining possible inventory source: /tmp/network-5rw/inventory.yml
18662 1726867305.17284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
18662 1726867305.17325: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
18662 1726867305.17341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
18662 1726867305.17380: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
18662 1726867305.17431: Loaded config def from plugin (inventory/script)
18662 1726867305.17433: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
18662 1726867305.17461: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
18662 1726867305.17521: Loaded config def from plugin (inventory/yaml)
18662 1726867305.17523: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
18662 1726867305.17581: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
18662 1726867305.17855: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
18662 1726867305.17857: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
18662 1726867305.17859: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
18662 1726867305.17864: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
18662 1726867305.17867: Loading data from /tmp/network-5rw/inventory.yml
18662 1726867305.17908: /tmp/network-5rw/inventory.yml was not parsable by auto
18662 1726867305.17953: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
18662 1726867305.17981: Loading data from /tmp/network-5rw/inventory.yml
18662 1726867305.18033: group all already in inventory
18662 1726867305.18038: set inventory_file for managed_node1
18662 1726867305.18040: set inventory_dir for managed_node1
18662 1726867305.18041: Added host managed_node1 to inventory
18662 1726867305.18042: Added host managed_node1 to group all
18662 1726867305.18043: set ansible_host for managed_node1
18662 1726867305.18044:
set ansible_ssh_extra_args for managed_node1 18662 1726867305.18045: set inventory_file for managed_node2 18662 1726867305.18047: set inventory_dir for managed_node2 18662 1726867305.18047: Added host managed_node2 to inventory 18662 1726867305.18048: Added host managed_node2 to group all 18662 1726867305.18049: set ansible_host for managed_node2 18662 1726867305.18049: set ansible_ssh_extra_args for managed_node2 18662 1726867305.18051: set inventory_file for managed_node3 18662 1726867305.18052: set inventory_dir for managed_node3 18662 1726867305.18053: Added host managed_node3 to inventory 18662 1726867305.18053: Added host managed_node3 to group all 18662 1726867305.18054: set ansible_host for managed_node3 18662 1726867305.18054: set ansible_ssh_extra_args for managed_node3 18662 1726867305.18056: Reconcile groups and hosts in inventory. 18662 1726867305.18058: Group ungrouped now contains managed_node1 18662 1726867305.18060: Group ungrouped now contains managed_node2 18662 1726867305.18061: Group ungrouped now contains managed_node3 18662 1726867305.18114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 18662 1726867305.18195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 18662 1726867305.18226: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 18662 1726867305.18244: Loaded config def from plugin (vars/host_group_vars) 18662 1726867305.18245: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 18662 1726867305.18250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 18662 1726867305.18255: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 18662 1726867305.18286: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 18662 1726867305.18512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867305.18575: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 18662 1726867305.18603: Loaded config def from plugin (connection/local) 18662 1726867305.18605: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 18662 1726867305.18985: Loaded config def from plugin (connection/paramiko_ssh) 18662 1726867305.18988: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 18662 1726867305.19532: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 18662 1726867305.19555: Loaded config def from plugin (connection/psrp) 18662 1726867305.19557: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 18662 1726867305.19954: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 18662 1726867305.19978: Loaded config def from plugin (connection/ssh) 18662 1726867305.19981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 18662 1726867305.21291: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 18662 1726867305.21315: Loaded config def from plugin (connection/winrm) 18662 1726867305.21316: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 18662 1726867305.21337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 18662 1726867305.21383: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 18662 1726867305.21423: Loaded config def from plugin (shell/cmd) 18662 1726867305.21424: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 18662 1726867305.21440: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 18662 1726867305.21480: Loaded config def from plugin (shell/powershell) 18662 1726867305.21482: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 18662 1726867305.21517: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 18662 1726867305.21621: Loaded config def from plugin (shell/sh) 18662 1726867305.21623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 18662 1726867305.21644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 18662 1726867305.21717: Loaded config def from plugin (become/runas) 18662 1726867305.21719: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 18662 1726867305.21827: Loaded config def from plugin (become/su) 18662 1726867305.21829: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 18662 1726867305.21924: Loaded config def from plugin (become/sudo) 18662 1726867305.21925: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 18662 1726867305.21947: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml 18662 1726867305.22155: in VariableManager get_vars() 18662 1726867305.22169: done with get_vars() 18662 1726867305.22258: trying /usr/local/lib/python3.12/site-packages/ansible/modules 18662 1726867305.24897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 18662 1726867305.24972: in VariableManager get_vars() 
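The inventory walk earlier in this log (the yaml plugin parsing /tmp/network-5rw/inventory.yml after host_list, script and auto declined, three managed_node* hosts added to all and reconciled into ungrouped, with ansible_host and ansible_ssh_extra_args set per host) is consistent with a YAML inventory shaped roughly like the sketch below. The addresses and SSH options are placeholders, not values recovered from the run.

# Hypothetical sketch of /tmp/network-5rw/inventory.yml; host vars are placeholders
all:
  hosts:
    managed_node1:
      ansible_host: 203.0.113.11                             # placeholder address
      ansible_ssh_extra_args: '-o StrictHostKeyChecking=no'  # placeholder options
    managed_node2:
      ansible_host: 203.0.113.12
      ansible_ssh_extra_args: '-o StrictHostKeyChecking=no'
    managed_node3:
      ansible_host: 203.0.113.13
      ansible_ssh_extra_args: '-o StrictHostKeyChecking=no'

Defining the hosts directly under all with no child groups is what leaves them in the built-in ungrouped group, matching the "Group ungrouped now contains managed_node*" reconciliation messages.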
18662 1726867305.24976: done with get_vars() 18662 1726867305.24980: variable 'playbook_dir' from source: magic vars 18662 1726867305.24980: variable 'ansible_playbook_python' from source: magic vars 18662 1726867305.24981: variable 'ansible_config_file' from source: magic vars 18662 1726867305.24981: variable 'groups' from source: magic vars 18662 1726867305.24982: variable 'omit' from source: magic vars 18662 1726867305.24982: variable 'ansible_version' from source: magic vars 18662 1726867305.24983: variable 'ansible_check_mode' from source: magic vars 18662 1726867305.24983: variable 'ansible_diff_mode' from source: magic vars 18662 1726867305.24984: variable 'ansible_forks' from source: magic vars 18662 1726867305.24984: variable 'ansible_inventory_sources' from source: magic vars 18662 1726867305.24984: variable 'ansible_skip_tags' from source: magic vars 18662 1726867305.24985: variable 'ansible_limit' from source: magic vars 18662 1726867305.24985: variable 'ansible_run_tags' from source: magic vars 18662 1726867305.24986: variable 'ansible_verbosity' from source: magic vars 18662 1726867305.25009: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml 18662 1726867305.25415: in VariableManager get_vars() 18662 1726867305.25427: done with get_vars() 18662 1726867305.25448: in VariableManager get_vars() 18662 1726867305.25461: done with get_vars() 18662 1726867305.25485: in VariableManager get_vars() 18662 1726867305.25493: done with get_vars() 18662 1726867305.25511: in VariableManager get_vars() 18662 1726867305.25518: done with get_vars() 18662 1726867305.25566: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 18662 1726867305.25686: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 18662 1726867305.25770: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 18662 1726867305.26136: in VariableManager get_vars() 18662 1726867305.26149: done with get_vars() 18662 1726867305.26467: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 18662 1726867305.26599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18662 1726867305.27790: in VariableManager get_vars() 18662 1726867305.27806: done with get_vars() 18662 1726867305.27925: in VariableManager get_vars() 18662 1726867305.27929: done with get_vars() 18662 1726867305.27931: variable 'playbook_dir' from source: magic vars 18662 1726867305.27932: variable 'ansible_playbook_python' from source: magic vars 18662 1726867305.27932: variable 'ansible_config_file' from source: magic vars 18662 1726867305.27933: variable 'groups' from source: magic vars 18662 1726867305.27934: variable 'omit' from source: magic vars 18662 1726867305.27935: variable 'ansible_version' from source: magic vars 18662 1726867305.27935: variable 'ansible_check_mode' from source: magic vars 18662 1726867305.27936: variable 'ansible_diff_mode' from source: magic vars 18662 1726867305.27937: variable 'ansible_forks' from source: magic vars 18662 1726867305.27937: variable 'ansible_inventory_sources' from source: magic vars 18662 1726867305.27938: variable 'ansible_skip_tags' from source: magic vars 18662 1726867305.27939: variable 'ansible_limit' from source: magic vars 18662 
1726867305.27940: variable 'ansible_run_tags' from source: magic vars 18662 1726867305.27940: variable 'ansible_verbosity' from source: magic vars 18662 1726867305.27971: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 18662 1726867305.28037: in VariableManager get_vars() 18662 1726867305.28040: done with get_vars() 18662 1726867305.28042: variable 'playbook_dir' from source: magic vars 18662 1726867305.28043: variable 'ansible_playbook_python' from source: magic vars 18662 1726867305.28044: variable 'ansible_config_file' from source: magic vars 18662 1726867305.28044: variable 'groups' from source: magic vars 18662 1726867305.28045: variable 'omit' from source: magic vars 18662 1726867305.28046: variable 'ansible_version' from source: magic vars 18662 1726867305.28046: variable 'ansible_check_mode' from source: magic vars 18662 1726867305.28047: variable 'ansible_diff_mode' from source: magic vars 18662 1726867305.28048: variable 'ansible_forks' from source: magic vars 18662 1726867305.28048: variable 'ansible_inventory_sources' from source: magic vars 18662 1726867305.28049: variable 'ansible_skip_tags' from source: magic vars 18662 1726867305.28050: variable 'ansible_limit' from source: magic vars 18662 1726867305.28051: variable 'ansible_run_tags' from source: magic vars 18662 1726867305.28051: variable 'ansible_verbosity' from source: magic vars 18662 1726867305.28083: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 18662 1726867305.28157: in VariableManager get_vars() 18662 1726867305.28169: done with get_vars() 18662 1726867305.28211: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 18662 1726867305.28320: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 18662 1726867305.28396: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 18662 1726867305.28821: in VariableManager get_vars() 18662 1726867305.28840: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18662 1726867305.30302: in VariableManager get_vars() 18662 1726867305.30321: done with get_vars() 18662 1726867305.30357: in VariableManager get_vars() 18662 1726867305.30360: done with get_vars() 18662 1726867305.30362: variable 'playbook_dir' from source: magic vars 18662 1726867305.30363: variable 'ansible_playbook_python' from source: magic vars 18662 1726867305.30364: variable 'ansible_config_file' from source: magic vars 18662 1726867305.30364: variable 'groups' from source: magic vars 18662 1726867305.30365: variable 'omit' from source: magic vars 18662 1726867305.30366: variable 'ansible_version' from source: magic vars 18662 1726867305.30366: variable 'ansible_check_mode' from source: magic vars 18662 1726867305.30367: variable 'ansible_diff_mode' from source: magic vars 18662 1726867305.30368: variable 'ansible_forks' from source: magic vars 18662 1726867305.30369: variable 'ansible_inventory_sources' from source: magic vars 18662 1726867305.30369: variable 'ansible_skip_tags' from source: magic vars 18662 1726867305.30370: variable 'ansible_limit' from source: magic vars 18662 1726867305.30371: variable 'ansible_run_tags' from source: magic vars 18662 1726867305.30371: variable 'ansible_verbosity' from source: magic vars 
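The loads in this stretch show tests_ethernet_nm.yml pulling in playbooks/tests_ethernet.yml together with the down_profile and remove_profile helper playbooks and the network role's defaults, meta and tasks. The usual wrapper pattern for such *_nm test files is a small shim that pins the provider and then imports the shared test playbook; the sketch below shows that shape only, assuming the common pattern rather than quoting the actual file (the set_fact task and the network_provider variable are the assumed parts).

# Sketch of a provider shim like tests_ethernet_nm.yml, assuming the usual pattern
- name: Run playbook 'playbooks/tests_ethernet.yml' with nm as provider
  hosts: all
  tasks:
    - name: Pin the provider for this run
      set_fact:
        network_provider: nm        # assumption: provider selected via a fact

- import_playbook: playbooks/tests_ethernet.yml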
18662 1726867305.30403: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 18662 1726867305.30474: in VariableManager get_vars() 18662 1726867305.30488: done with get_vars() 18662 1726867305.30530: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 18662 1726867305.32127: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 18662 1726867305.32206: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 18662 1726867305.32588: in VariableManager get_vars() 18662 1726867305.32605: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18662 1726867305.34168: in VariableManager get_vars() 18662 1726867305.34184: done with get_vars() 18662 1726867305.34222: in VariableManager get_vars() 18662 1726867305.34234: done with get_vars() 18662 1726867305.34295: in VariableManager get_vars() 18662 1726867305.34307: done with get_vars() 18662 1726867305.34400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 18662 1726867305.34414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 18662 1726867305.34689: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 18662 1726867305.34871: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 18662 1726867305.34874: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 18662 1726867305.34907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 18662 1726867305.34934: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 18662 1726867305.35101: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 18662 1726867305.35159: Loaded config def from plugin (callback/default) 18662 1726867305.35162: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18662 1726867305.36275: Loaded config def from plugin (callback/junit) 18662 1726867305.36280: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18662 1726867305.36310: Loading ModuleDocFragment 'result_format_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 18662 1726867305.36349: Loaded config def from plugin (callback/minimal) 18662 1726867305.36350: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18662 1726867305.36376: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 18662 1726867305.36417: Loaded config def from plugin (callback/tree) 18662 1726867305.36421: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 18662 1726867305.36502: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 18662 1726867305.36504: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
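The callback handling just above also documents where the run's plugins come from: ansible.builtin.debug is redirected to ansible.posix.debug, ansible.builtin.profile_tasks to ansible.posix.profile_tasks, and everything resolves under the collection path /tmp/collections-Isn. That implies both collections were installed into that directory before the run, for example with ansible-galaxy collection install -r requirements.yml -p /tmp/collections-Isn; a minimal requirements file for that would look like the sketch below (whether the CI pins versions is not visible in this log).

# Sketch of a requirements.yml matching the collections resolved from /tmp/collections-Isn
collections:
  - name: fedora.linux_system_roles
  - name: ansible.posix

The "Skipping callback" messages are expected: the redirected ansible.posix.debug callback is already acting as the stdout callback, so the built-in default, minimal and oneline stdout callbacks are not loaded on top of it.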
PLAYBOOK: tests_ethernet_nm.yml ************************************************
10 plays in /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
18662 1726867305.36524: in VariableManager get_vars()
18662 1726867305.36531: done with get_vars()
18662 1726867305.36535: in VariableManager get_vars()
18662 1726867305.36540: done with get_vars()
18662 1726867305.36543: variable 'omit' from source: magic vars
18662 1726867305.36564: in VariableManager get_vars()
18662 1726867305.36572: done with get_vars()
18662 1726867305.36587: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ethernet.yml' with nm as provider] *********
18662 1726867305.36952: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
18662 1726867305.37002: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
18662 1726867305.37026: getting the remaining hosts for this loop
18662 1726867305.37027: done getting the remaining hosts for this loop
18662 1726867305.37029: getting the next task for host managed_node2
18662 1726867305.37032: done getting next task for host managed_node2
18662 1726867305.37034: ^ task is: TASK: Gathering Facts
18662 1726867305.37036: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18662 1726867305.37040: getting variables
18662 1726867305.37041: in VariableManager get_vars()
18662 1726867305.37048: Calling all_inventory to load vars for managed_node2
18662 1726867305.37049: Calling groups_inventory to load vars for managed_node2
18662 1726867305.37051: Calling all_plugins_inventory to load vars for managed_node2
18662 1726867305.37058: Calling all_plugins_play to load vars for managed_node2
18662 1726867305.37065: Calling groups_plugins_inventory to load vars for managed_node2
18662 1726867305.37067: Calling groups_plugins_play to load vars for managed_node2
18662 1726867305.37090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18662 1726867305.37125: done with get_vars()
18662 1726867305.37130: done getting variables
18662 1726867305.37174: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
Friday 20 September 2024 17:21:45 -0400 (0:00:00.007) 0:00:00.007 ******
18662 1726867305.37190: entering _queue_task() for managed_node2/gather_facts
18662 1726867305.37191: Creating lock for gather_facts
18662 1726867305.37469: worker is 1 (out of 1 available)
18662 1726867305.37481: exiting _queue_task() for managed_node2/gather_facts
18662 1726867305.37493: done queuing things up, now waiting for results queue to drain
18662 1726867305.37496: waiting for pending results...
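The Gathering Facts task queued here is the implicit fact-gathering step Ansible adds at the top of the play; the gather_facts action plugin invokes the configured fact modules, which is why the module payload built further down is ansible.modules.setup (AnsiballZ_setup.py). Purely as an illustration of the same mechanism, and not something this test playbook does, an explicit, reduced-scope equivalent would look like:

# Generic illustration: explicit fact gathering instead of the implicit task
- name: Gather a reduced set of facts explicitly
  hosts: all
  gather_facts: false             # suppress the implicit Gathering Facts task
  tasks:
    - name: Gathering Facts (explicit)
      ansible.builtin.setup:
        gather_subset:
          - network               # illustrative subset; this run gathers the default set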
18662 1726867305.37731: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18662 1726867305.37879: in run() - task 0affcac9-a3a5-efab-a8ce-00000000007c 18662 1726867305.37883: variable 'ansible_search_path' from source: unknown 18662 1726867305.37886: calling self._execute() 18662 1726867305.37888: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867305.37891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867305.37894: variable 'omit' from source: magic vars 18662 1726867305.38195: variable 'omit' from source: magic vars 18662 1726867305.38199: variable 'omit' from source: magic vars 18662 1726867305.38284: variable 'omit' from source: magic vars 18662 1726867305.38288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867305.38584: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867305.38587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867305.38590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867305.38592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867305.38594: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867305.38596: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867305.38599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867305.38698: Set connection var ansible_timeout to 10 18662 1726867305.38790: Set connection var ansible_connection to ssh 18662 1726867305.38801: Set connection var ansible_shell_executable to /bin/sh 18662 1726867305.38808: Set connection var ansible_shell_type to sh 18662 1726867305.38982: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867305.38986: Set connection var ansible_pipelining to False 18662 1726867305.38988: variable 'ansible_shell_executable' from source: unknown 18662 1726867305.38990: variable 'ansible_connection' from source: unknown 18662 1726867305.38992: variable 'ansible_module_compression' from source: unknown 18662 1726867305.38994: variable 'ansible_shell_type' from source: unknown 18662 1726867305.38997: variable 'ansible_shell_executable' from source: unknown 18662 1726867305.38999: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867305.39001: variable 'ansible_pipelining' from source: unknown 18662 1726867305.39003: variable 'ansible_timeout' from source: unknown 18662 1726867305.39005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867305.39297: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867305.39301: variable 'omit' from source: magic vars 18662 1726867305.39303: starting attempt loop 18662 1726867305.39305: running the handler 18662 1726867305.39307: variable 'ansible_facts' from source: unknown 18662 1726867305.39363: _low_level_execute_command(): starting 18662 1726867305.39382: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867305.40200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867305.40222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867305.40240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867305.40316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867305.42017: stdout chunk (state=3): >>>/root <<< 18662 1726867305.42153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867305.42168: stderr chunk (state=3): >>><<< 18662 1726867305.42181: stdout chunk (state=3): >>><<< 18662 1726867305.42210: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867305.42230: _low_level_execute_command(): starting 18662 1726867305.42244: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515 `" && echo ansible-tmp-1726867305.4221702-18670-233934775389515="` echo /root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515 `" ) && sleep 0' 18662 1726867305.42900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18662 1726867305.42990: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867305.43024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867305.43040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867305.43123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867305.45236: stdout chunk (state=3): >>>ansible-tmp-1726867305.4221702-18670-233934775389515=/root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515 <<< 18662 1726867305.45283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867305.45286: stdout chunk (state=3): >>><<< 18662 1726867305.45289: stderr chunk (state=3): >>><<< 18662 1726867305.45475: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867305.4221702-18670-233934775389515=/root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867305.45480: variable 'ansible_module_compression' from source: unknown 18662 1726867305.45482: ANSIBALLZ: Using generic lock for ansible.legacy.setup 18662 1726867305.45587: ANSIBALLZ: Acquiring lock 18662 1726867305.45596: ANSIBALLZ: Lock acquired: 140264020905808 18662 1726867305.45604: ANSIBALLZ: Creating module 18662 1726867305.79525: ANSIBALLZ: Writing module into payload 18662 1726867305.79682: ANSIBALLZ: Writing module 18662 1726867305.79708: ANSIBALLZ: Renaming module 18662 1726867305.79722: ANSIBALLZ: Done creating 
module 18662 1726867305.79748: variable 'ansible_facts' from source: unknown 18662 1726867305.79759: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867305.79771: _low_level_execute_command(): starting 18662 1726867305.79787: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 18662 1726867305.80571: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867305.80605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867305.80633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867305.80657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867305.80727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867305.82479: stdout chunk (state=3): >>>PLATFORM <<< 18662 1726867305.82522: stdout chunk (state=3): >>>Linux <<< 18662 1726867305.82621: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 18662 1726867305.82929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867305.82939: stdout chunk (state=3): >>><<< 18662 1726867305.82960: stderr chunk (state=3): >>><<< 18662 1726867305.83146: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867305.83153 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 18662 1726867305.83156: _low_level_execute_command(): starting 18662 1726867305.83158: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 18662 1726867305.83266: Sending initial data 18662 1726867305.83327: Sent initial data (1181 bytes) 18662 1726867305.84325: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867305.84341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867305.84469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 18662 1726867305.84491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867305.84594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867305.84740: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867305.84763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867305.84811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867305.84907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867305.88446: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 18662 1726867305.88767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867305.88807: stderr chunk (state=3): >>><<< 18662 1726867305.89085: stdout chunk (state=3): >>><<< 18662 1726867305.89088: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867305.89096: variable 'ansible_facts' from source: unknown 18662 1726867305.89103: variable 'ansible_facts' from source: unknown 18662 1726867305.89117: variable 'ansible_module_compression' from source: unknown 18662 1726867305.89163: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18662 1726867305.89224: variable 'ansible_facts' from source: unknown 18662 1726867305.89572: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515/AnsiballZ_setup.py 18662 1726867305.89976: Sending initial data 18662 1726867305.89981: Sent initial data (154 bytes) 18662 1726867305.91065: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867305.91069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867305.91072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867305.91158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867305.91295: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867305.92810: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867305.92952: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867305.92989: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp5yylobpn /root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515/AnsiballZ_setup.py <<< 18662 1726867305.93035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp5yylobpn" to remote "/root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515/AnsiballZ_setup.py" <<< 18662 1726867305.93074: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515/AnsiballZ_setup.py" <<< 18662 1726867305.96107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867305.96335: stderr chunk (state=3): >>><<< 18662 1726867305.96338: stdout chunk (state=3): >>><<< 18662 1726867305.96341: done transferring module to remote 18662 1726867305.96343: _low_level_execute_command(): starting 18662 1726867305.96345: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515/ /root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515/AnsiballZ_setup.py && sleep 0' 18662 1726867305.97493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867305.97503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867305.97515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867305.97534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867305.97676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867305.97988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867305.98074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867305.99897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867306.00030: stderr chunk (state=3): >>><<< 18662 1726867306.00033: stdout chunk (state=3): >>><<< 18662 1726867306.00051: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867306.00054: _low_level_execute_command(): starting 18662 1726867306.00059: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515/AnsiballZ_setup.py && sleep 0' 18662 1726867306.01215: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867306.01323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867306.01326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867306.01478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867306.01494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867306.01503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867306.01582: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867306.03775: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 18662 1726867306.03801: stdout chunk (state=3): >>>import _imp # builtin <<< 18662 1726867306.03831: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 18662 1726867306.03838: stdout chunk (state=3): >>>import '_weakref' # <<< 18662 1726867306.03933: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 18662 1726867306.03940: stdout chunk (state=3): >>>import 'posix' # <<< 18662 1726867306.03981: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 18662 1726867306.04113: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 18662 1726867306.04339: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd1bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd18bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 18662 1726867306.04344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd1bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 18662 1726867306.04583: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd1cd130> <<< 18662 1726867306.04689: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd1cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 18662 1726867306.05303: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfcbdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfcbfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 18662 1726867306.05409: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 18662 1726867306.05413: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd003800> <<< 18662 1726867306.05442: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 18662 1726867306.05446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd003e90> <<< 18662 1726867306.05559: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfe3aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfe11c0> <<< 18662 1726867306.05679: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfc8f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 18662 1726867306.05695: stdout chunk (state=3): >>>import '_sre' # <<< 18662 1726867306.05909: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f66fd023770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd022390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfe2090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd020ad0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 18662 1726867306.05929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd058800> <<< 18662 1726867306.05932: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfc8200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 18662 1726867306.05971: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fd058cb0> <<< 18662 1726867306.05974: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd058b60> <<< 18662 1726867306.06013: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867306.06016: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fd058ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfc6d20> <<< 18662 1726867306.06082: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 18662 1726867306.06194: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd059550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd059220> import 'importlib.machinery' # <<< 18662 1726867306.06328: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd05a450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches 
/usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd070680> import 'errno' # <<< 18662 1726867306.06346: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fd071d60> <<< 18662 1726867306.06434: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd072c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867306.06703: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fd073260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd072150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fd073ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd073410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd05a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 18662 1726867306.06713: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fcd73bc0> <<< 18662 1726867306.06732: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 18662 1726867306.06745: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f66fcd9c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcd9c410> <<< 18662 1726867306.06775: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fcd9c6e0> <<< 18662 1726867306.06807: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 18662 1726867306.06981: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867306.07002: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fcd9d010> <<< 18662 1726867306.07125: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fcd9d9d0> <<< 18662 1726867306.07140: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcd9c8c0> <<< 18662 1726867306.07151: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcd71d60> <<< 18662 1726867306.07176: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 18662 1726867306.07194: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 18662 1726867306.07213: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 18662 1726867306.07301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcd9edb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcd9daf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd05aba0> <<< 18662 1726867306.07304: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 18662 1726867306.07500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcdcb140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 18662 1726867306.07507: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867306.07532: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 18662 1726867306.07547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 18662 1726867306.07585: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcdeb500> <<< 18662 1726867306.07644: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 18662 1726867306.07657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 18662 1726867306.07696: stdout chunk (state=3): >>>import 'ntpath' # <<< 18662 1726867306.07799: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fce4c290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 18662 1726867306.08310: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fce4e9f0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fce4c3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fce112b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc7253a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcdea300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcd9fd10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f66fcdea660> <<< 18662 1726867306.08525: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_raxhq18q/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 18662 1726867306.08801: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 18662 1726867306.08835: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f66fc78b0b0> <<< 18662 1726867306.08851: stdout chunk (state=3): >>>import '_typing' # <<< 18662 1726867306.09100: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc769fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc769100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.09113: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.09135: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 18662 1726867306.10608: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.11713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc788f50> <<< 18662 1726867306.11982: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc7ba960> <<< 18662 1726867306.11986: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc7ba720> <<< 18662 1726867306.11988: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc7ba030> <<< 18662 1726867306.11991: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 18662 1726867306.12198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 18662 1726867306.12215: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc7ba480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc78bad0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc7bb6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc7bb800> # 
/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc7bbd10> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 18662 1726867306.12239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 18662 1726867306.12283: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc625a90> <<< 18662 1726867306.12534: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc6276b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc627f80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc6291f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 18662 1726867306.12586: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc62bce0> <<< 18662 1726867306.12629: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fcfc6cc0> <<< 18662 1726867306.12649: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc629fa0> <<< 18662 1726867306.12667: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 18662 1726867306.12800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 18662 1726867306.12898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from 
'/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 18662 1726867306.12918: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc633d40> import '_tokenize' # <<< 18662 1726867306.13231: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc632810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc632570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc632ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc62a4b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc677fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc678080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 18662 1726867306.13286: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 18662 1726867306.13300: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc679b50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc679910> <<< 18662 1726867306.13318: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 18662 1726867306.13348: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18662 1726867306.13883: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc67c0e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc67a240> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc67f890> <<< 18662 1726867306.13889: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc67c260> <<< 18662 1726867306.13891: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc6808f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc680a70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc680a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc678230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 18662 1726867306.13922: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 18662 1726867306.13934: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867306.13963: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc5081d0> <<< 18662 1726867306.14228: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867306.14231: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc509490> <<< 18662 1726867306.14234: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc682960> <<< 18662 1726867306.14236: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867306.14239: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc683d10> 
import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc6825d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 18662 1726867306.14289: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.14320: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.14413: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.14447: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 18662 1726867306.14450: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.14468: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 18662 1726867306.14504: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.14664: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.14723: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.15288: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.15884: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 18662 1726867306.15888: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 18662 1726867306.16110: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc5114c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc512180> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5095b0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 18662 1726867306.16130: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.16144: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 18662 1726867306.16160: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.16299: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.16454: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 18662 1726867306.16473: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5121b0> <<< 18662 1726867306.16486: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.16946: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.17484: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.17487: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.17540: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common.collections' # <<< 18662 1726867306.17558: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.17662: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 18662 1726867306.17710: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.17797: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 18662 1726867306.17844: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.17850: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 18662 1726867306.17891: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.17905: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.17955: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 18662 1726867306.17966: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.18175: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.18400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18662 1726867306.18469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 18662 1726867306.18482: stdout chunk (state=3): >>>import '_ast' # <<< 18662 1726867306.18686: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5132c0> <<< 18662 1726867306.18689: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.18695: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.18725: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 18662 1726867306.18736: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 18662 1726867306.18791: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.18803: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.18840: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 18662 1726867306.18897: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.18941: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.18999: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.19066: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 18662 1726867306.19118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867306.19282: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867306.19286: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc51dcd0> <<< 18662 1726867306.19288: stdout chunk (state=3): >>>import 'selinux' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f66fc51b080> <<< 18662 1726867306.19354: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 18662 1726867306.19380: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.19422: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.19457: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.19675: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 18662 1726867306.19680: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 18662 1726867306.19683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 18662 1726867306.19685: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 18662 1726867306.19687: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18662 1726867306.19729: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc6066c0> <<< 18662 1726867306.19771: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc6fe390> <<< 18662 1726867306.19870: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc51ddf0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5107d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 18662 1726867306.20006: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.20031: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 18662 1726867306.20051: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 18662 1726867306.20099: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.20165: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.20179: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.20208: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.20253: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.20295: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.20330: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.20557: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 18662 1726867306.20560: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.20573: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 18662 1726867306.20599: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 18662 1726867306.20804: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.21007: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.21028: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.21063: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867306.21120: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 18662 1726867306.21145: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 18662 1726867306.21287: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5b1d90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 18662 1726867306.21291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 18662 1726867306.21296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 18662 1726867306.21321: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 18662 1726867306.21341: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc19bc50> <<< 18662 1726867306.21364: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867306.21468: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc19bf80> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5b33e0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5b2900> <<< 18662 1726867306.21482: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5b0560> <<< 18662 1726867306.21506: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5b1dc0> <<< 18662 1726867306.21539: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 18662 1726867306.21784: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 18662 1726867306.21787: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 18662 1726867306.21790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc1b2f30> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc1b27e0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc1b29c0> <<< 18662 1726867306.21792: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc1b1c10> <<< 18662 1726867306.21794: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 18662 1726867306.21870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 18662 1726867306.21905: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc1b3050> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 18662 1726867306.21938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 18662 1726867306.21992: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc209b50> <<< 18662 1726867306.22032: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc1b3b30> <<< 18662 1726867306.22121: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5b01a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 18662 1726867306.22139: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 18662 1726867306.22162: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.22222: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 18662 1726867306.22250: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.22434: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 18662 1726867306.22437: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 18662 1726867306.22443: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.22447: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.22488: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 18662 1726867306.22507: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.22797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.22980: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.22984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 18662 1726867306.22986: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 18662 1726867306.23401: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.23838: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 18662 1726867306.23905: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.23955: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.24003: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.24022: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 18662 1726867306.24070: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 18662 1726867306.24091: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.24221: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 18662 1726867306.24242: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 18662 1726867306.24280: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.24305: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 18662 1726867306.24328: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.24351: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.24602: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 18662 1726867306.24607: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.24610: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.24612: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 18662 1726867306.24614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 18662 1726867306.24619: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc20b320> <<< 18662 1726867306.24621: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 18662 
1726867306.24652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 18662 1726867306.24771: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc20a4e0> import 'ansible.module_utils.facts.system.local' # <<< 18662 1726867306.24834: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.24848: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.24983: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 18662 1726867306.24986: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.25019: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.25104: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 18662 1726867306.25140: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.25182: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.25359: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 18662 1726867306.25362: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.25364: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.25391: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 18662 1726867306.25421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 18662 1726867306.25498: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867306.25591: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc249d60> <<< 18662 1726867306.25726: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc22ec00> <<< 18662 1726867306.25874: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 18662 1726867306.25950: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.26033: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.26230: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.26302: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 18662 1726867306.26418: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 18662 1726867306.26433: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.26488: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 18662 1726867306.26527: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867306.26550: stdout chunk (state=3): >>># 
extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc251ac0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc253470> <<< 18662 1726867306.26784: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 18662 1726867306.26787: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.26790: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 18662 1726867306.27194: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.27219: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.27453: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.27498: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.27675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 18662 1726867306.27681: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.27784: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.28400: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.28546: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.29071: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 18662 1726867306.29082: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.29188: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.29503: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # <<< 18662 1726867306.29515: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.29668: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.29911: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.29945: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 18662 1726867306.30198: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.30371: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.30800: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.31027: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.31042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 18662 1726867306.31057: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.31113: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.31183: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 18662 1726867306.31187: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.31462: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.31983: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.31990: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 18662 1726867306.31993: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.31995: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.31998: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 18662 1726867306.32200: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.32402: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 18662 1726867306.32418: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.32436: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.32556: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.32559: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.32900: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 18662 1726867306.32997: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.33302: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 18662 1726867306.33348: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.33396: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 18662 1726867306.33413: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 18662 1726867306.33699: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867306.33761: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 18662 1726867306.33781: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 18662 1726867306.33863: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867306.34401: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc04e3c0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc04fc80> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc047f80> <<< 18662 1726867306.46466: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 18662 1726867306.46492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc0962a0> <<< 18662 1726867306.46519: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 18662 1726867306.46542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 18662 1726867306.46703: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc095070> <<< 18662 1726867306.46707: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 18662 1726867306.46710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867306.46712: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 18662 1726867306.46717: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc097350> <<< 18662 1726867306.46924: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc095f70> <<< 18662 1726867306.46943: 
stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 18662 1726867306.70858: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "46", "epoch": "1726867306", 
"epoch_int": "1726867306", "date": "2024-09-20", "time": "17:21:46", "iso8601_micro": "2024-09-20T21:21:46.352625Z", "iso8601": "2024-09-20T21:21:46Z", "iso8601_basic": "20240920T172146352625", "iso8601_basic_short": "20240920T172146", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 544, 
"ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794816000, "block_size": 4096, "block_total": 65519099, "block_available": 63914750, "block_used": 1604349, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_loadavg": {"1m": 0.4501953125, "5m": 0.39111328125, "15m": 0.2021484375}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off 
[fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixe<<< 18662 1726867306.70870: stdout chunk (state=3): >>>d]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": 
["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18662 1726867306.71450: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 18662 1726867306.71471: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr <<< 18662 1726867306.71764: stdout chunk (state=3): >>># cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch <<< 18662 1726867306.71771: stdout chunk (state=3): >>># cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] 
removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections <<< 18662 1726867306.71780: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy 
ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils <<< 18662 1726867306.71783: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai <<< 18662 1726867306.71786: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing 
ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios <<< 18662 1726867306.71788: stdout chunk (state=3): >>># cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # 
cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd <<< 18662 1726867306.71853: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy 
ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 18662 1726867306.72508: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors <<< 18662 1726867306.72532: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 18662 1726867306.72554: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 18662 1726867306.72570: stdout chunk (state=3): >>># destroy _ssl <<< 18662 1726867306.72815: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # 
cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 18662 1726867306.72937: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools<<< 18662 1726867306.72941: stdout chunk (state=3): >>> # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig <<< 18662 1726867306.72943: stdout chunk (state=3): >>># cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc <<< 18662 1726867306.72946: stdout chunk (state=3): >>># cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 18662 1726867306.72948: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18662 1726867306.73088: stdout chunk (state=3): >>># destroy sys.monitoring <<< 18662 1726867306.73108: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 18662 1726867306.73138: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 18662 1726867306.73156: stdout chunk (state=3): >>># destroy tokenize <<< 18662 1726867306.73171: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 18662 1726867306.73373: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib # destroy _typing <<< 18662 1726867306.73379: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 18662 1726867306.73385: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 18662 1726867306.73391: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # 
destroy weakref # destroy collections # destroy threading <<< 18662 1726867306.73397: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 18662 1726867306.73423: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 18662 1726867306.73434: stdout chunk (state=3): >>># destroy _hashlib <<< 18662 1726867306.73700: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 18662 1726867306.73843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867306.73890: stderr chunk (state=3): >>><<< 18662 1726867306.73893: stdout chunk (state=3): >>><<< 18662 1726867306.74374: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd1bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd18bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd1bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd1cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd1cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfcbdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfcbfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd003800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd003e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfe3aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfe11c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfc8f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd023770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd022390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfe2090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd020ad0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd058800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfc8200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fd058cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd058b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fd058ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcfc6d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd059550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd059220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd05a450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd070680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fd071d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f66fd072c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fd073260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd072150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fd073ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd073410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd05a4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fcd73bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fcd9c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcd9c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fcd9c6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fcd9d010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fcd9d9d0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f66fcd9c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcd71d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcd9edb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcd9daf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fd05aba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcdcb140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcdeb500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fce4c290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fce4e9f0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fce4c3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fce112b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc7253a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcdea300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fcd9fd10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f66fcdea660> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_raxhq18q/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc78b0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc769fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc769100> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc788f50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc7ba960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc7ba720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc7ba030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc7ba480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc78bad0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc7bb6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc7bb800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc7bbd10> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc625a90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc6276b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc627f80> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc6291f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc62bce0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fcfc6cc0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc629fa0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc633d40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc632810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f66fc632570> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc632ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc62a4b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc677fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc678080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc679b50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc679910> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc67c0e0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc67a240> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc67f890> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc67c260> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc6808f0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc680a70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc680a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc678230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc5081d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc509490> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc682960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc683d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc6825d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc5114c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc512180> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5095b0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5121b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5132c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc51dcd0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc51b080> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc6066c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc6fe390> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc51ddf0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5107d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5b1d90> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc19bc50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc19bf80> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5b33e0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5b2900> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5b0560> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5b1dc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc1b2f30> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc1b27e0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc1b29c0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc1b1c10> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc1b3050> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc209b50> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc1b3b30> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc5b01a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc20b320> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc20a4e0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc249d60> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc22ec00> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc251ac0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc253470> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f66fc04e3c0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc04fc80> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc047f80> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc0962a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc095070> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc097350> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f66fc095f70> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, 
"ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "46", "epoch": "1726867306", "epoch_int": "1726867306", "date": "2024-09-20", "time": "17:21:46", "iso8601_micro": "2024-09-20T21:21:46.352625Z", "iso8601": "2024-09-20T21:21:46Z", "iso8601_basic": "20240920T172146352625", "iso8601_basic_short": "20240920T172146", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": 
"512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 544, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794816000, "block_size": 4096, "block_total": 65519099, "block_available": 63914750, "block_used": 1604349, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_loadavg": {"1m": 0.4501953125, "5m": 0.39111328125, "15m": 0.2021484375}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", 
"prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing 
random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] 
removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # 
cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] 
removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping 
_functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing 
ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing 
ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy 
importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping 
operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
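The interpreter-discovery warning above can be avoided by pinning the Python interpreter for the host explicitly instead of relying on discovery. A minimal sketch of what such an inventory entry could look like, assuming the host name and address seen in this log (the actual /tmp/network-5rw/inventory.yml used for this run is not reproduced in this output, so the exact layout is an assumption):

    all:
      hosts:
        managed_node2:
          ansible_host: 10.31.12.116                        # address seen in the SSH debug output above
          ansible_python_interpreter: /usr/bin/python3.12   # path reported by the warning; pinning it skips discovery

With ansible_python_interpreter set, ansible-core no longer prints the "discovered Python interpreter" warning for that host, since the path is taken from inventory rather than probed on the target.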
18662 1726867306.76687: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867306.76690: _low_level_execute_command(): starting 18662 1726867306.76693: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867305.4221702-18670-233934775389515/ > /dev/null 2>&1 && sleep 0' 18662 1726867306.78082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867306.78086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867306.78088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867306.78091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867306.80003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867306.80006: stderr chunk (state=3): >>><<< 18662 1726867306.80008: stdout chunk (state=3): >>><<< 18662 1726867306.80011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867306.80047: handler run complete 18662 1726867306.80138: variable 'ansible_facts' from source: unknown 18662 1726867306.80349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867306.81031: variable 'ansible_facts' from source: unknown 18662 1726867306.81309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867306.81569: attempt loop complete, returning result 18662 1726867306.81572: _execute() done 18662 1726867306.81574: dumping result to json 18662 1726867306.81597: done dumping result, returning 18662 1726867306.81605: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-efab-a8ce-00000000007c] 18662 1726867306.81608: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000007c 18662 1726867306.82892: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000007c 18662 1726867306.82895: WORKER PROCESS EXITING ok: [managed_node2] 18662 1726867306.83618: no more pending results, returning what we have 18662 1726867306.83621: results queue empty 18662 1726867306.83622: checking for any_errors_fatal 18662 1726867306.83623: done checking for any_errors_fatal 18662 1726867306.83624: checking for max_fail_percentage 18662 1726867306.83625: done checking for max_fail_percentage 18662 1726867306.83626: checking to see if all hosts have failed and the running result is not ok 18662 1726867306.83626: done checking to see if all hosts have failed 18662 1726867306.83627: getting the remaining hosts for this loop 18662 1726867306.83629: done getting the remaining hosts for this loop 18662 1726867306.83632: getting the next task for host managed_node2 18662 1726867306.83638: done getting next task for host managed_node2 18662 1726867306.83640: ^ task is: TASK: meta (flush_handlers) 18662 1726867306.83642: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867306.83645: getting variables 18662 1726867306.83647: in VariableManager get_vars() 18662 1726867306.83668: Calling all_inventory to load vars for managed_node2 18662 1726867306.83671: Calling groups_inventory to load vars for managed_node2 18662 1726867306.83674: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867306.83685: Calling all_plugins_play to load vars for managed_node2 18662 1726867306.83688: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867306.83691: Calling groups_plugins_play to load vars for managed_node2 18662 1726867306.84501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867306.84983: done with get_vars() 18662 1726867306.84993: done getting variables 18662 1726867306.85057: in VariableManager get_vars() 18662 1726867306.85131: Calling all_inventory to load vars for managed_node2 18662 1726867306.85134: Calling groups_inventory to load vars for managed_node2 18662 1726867306.85136: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867306.85141: Calling all_plugins_play to load vars for managed_node2 18662 1726867306.85143: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867306.85146: Calling groups_plugins_play to load vars for managed_node2 18662 1726867306.85695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867306.86203: done with get_vars() 18662 1726867306.86219: done queuing things up, now waiting for results queue to drain 18662 1726867306.86221: results queue empty 18662 1726867306.86222: checking for any_errors_fatal 18662 1726867306.86224: done checking for any_errors_fatal 18662 1726867306.86225: checking for max_fail_percentage 18662 1726867306.86226: done checking for max_fail_percentage 18662 1726867306.86227: checking to see if all hosts have failed and the running result is not ok 18662 1726867306.86228: done checking to see if all hosts have failed 18662 1726867306.86233: getting the remaining hosts for this loop 18662 1726867306.86234: done getting the remaining hosts for this loop 18662 1726867306.86237: getting the next task for host managed_node2 18662 1726867306.86241: done getting next task for host managed_node2 18662 1726867306.86244: ^ task is: TASK: Include the task 'el_repo_setup.yml' 18662 1726867306.86245: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867306.86247: getting variables 18662 1726867306.86248: in VariableManager get_vars() 18662 1726867306.86257: Calling all_inventory to load vars for managed_node2 18662 1726867306.86259: Calling groups_inventory to load vars for managed_node2 18662 1726867306.86261: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867306.86265: Calling all_plugins_play to load vars for managed_node2 18662 1726867306.86267: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867306.86270: Calling groups_plugins_play to load vars for managed_node2 18662 1726867306.86830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867306.87468: done with get_vars() 18662 1726867306.87476: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:11 Friday 20 September 2024 17:21:46 -0400 (0:00:01.508) 0:00:01.516 ****** 18662 1726867306.88059: entering _queue_task() for managed_node2/include_tasks 18662 1726867306.88061: Creating lock for include_tasks 18662 1726867306.88691: worker is 1 (out of 1 available) 18662 1726867306.88926: exiting _queue_task() for managed_node2/include_tasks 18662 1726867306.88939: done queuing things up, now waiting for results queue to drain 18662 1726867306.88940: waiting for pending results... 18662 1726867306.89894: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 18662 1726867306.89899: in run() - task 0affcac9-a3a5-efab-a8ce-000000000006 18662 1726867306.89902: variable 'ansible_search_path' from source: unknown 18662 1726867306.89905: calling self._execute() 18662 1726867306.90583: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867306.90588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867306.90591: variable 'omit' from source: magic vars 18662 1726867306.90593: _execute() done 18662 1726867306.90596: dumping result to json 18662 1726867306.90598: done dumping result, returning 18662 1726867306.90601: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0affcac9-a3a5-efab-a8ce-000000000006] 18662 1726867306.90603: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000006 18662 1726867306.90676: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000006 18662 1726867306.90683: WORKER PROCESS EXITING 18662 1726867306.90726: no more pending results, returning what we have 18662 1726867306.90731: in VariableManager get_vars() 18662 1726867306.90760: Calling all_inventory to load vars for managed_node2 18662 1726867306.90762: Calling groups_inventory to load vars for managed_node2 18662 1726867306.90769: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867306.90784: Calling all_plugins_play to load vars for managed_node2 18662 1726867306.90787: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867306.90790: Calling groups_plugins_play to load vars for managed_node2 18662 1726867306.91163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867306.91836: done with get_vars() 18662 1726867306.91844: variable 'ansible_search_path' from source: unknown 18662 1726867306.92084: we have included files to process 18662 1726867306.92086: 
generating all_blocks data 18662 1726867306.92087: done generating all_blocks data 18662 1726867306.92088: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18662 1726867306.92089: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18662 1726867306.92092: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 18662 1726867306.94824: in VariableManager get_vars() 18662 1726867306.94841: done with get_vars() 18662 1726867306.94855: done processing included file 18662 1726867306.94857: iterating over new_blocks loaded from include file 18662 1726867306.94859: in VariableManager get_vars() 18662 1726867306.94959: done with get_vars() 18662 1726867306.94961: filtering new block on tags 18662 1726867306.94976: done filtering new block on tags 18662 1726867306.94982: in VariableManager get_vars() 18662 1726867306.94992: done with get_vars() 18662 1726867306.94994: filtering new block on tags 18662 1726867306.95013: done filtering new block on tags 18662 1726867306.95016: in VariableManager get_vars() 18662 1726867306.95027: done with get_vars() 18662 1726867306.95028: filtering new block on tags 18662 1726867306.95040: done filtering new block on tags 18662 1726867306.95042: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 18662 1726867306.95047: extending task lists for all hosts with included blocks 18662 1726867306.96002: done extending task lists 18662 1726867306.96004: done processing included files 18662 1726867306.96005: results queue empty 18662 1726867306.96005: checking for any_errors_fatal 18662 1726867306.96007: done checking for any_errors_fatal 18662 1726867306.96007: checking for max_fail_percentage 18662 1726867306.96011: done checking for max_fail_percentage 18662 1726867306.96012: checking to see if all hosts have failed and the running result is not ok 18662 1726867306.96013: done checking to see if all hosts have failed 18662 1726867306.96013: getting the remaining hosts for this loop 18662 1726867306.96015: done getting the remaining hosts for this loop 18662 1726867306.96017: getting the next task for host managed_node2 18662 1726867306.96021: done getting next task for host managed_node2 18662 1726867306.96023: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 18662 1726867306.96025: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867306.96027: getting variables 18662 1726867306.96028: in VariableManager get_vars() 18662 1726867306.96036: Calling all_inventory to load vars for managed_node2 18662 1726867306.96038: Calling groups_inventory to load vars for managed_node2 18662 1726867306.96041: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867306.96046: Calling all_plugins_play to load vars for managed_node2 18662 1726867306.96048: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867306.96051: Calling groups_plugins_play to load vars for managed_node2 18662 1726867306.96532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867306.97291: done with get_vars() 18662 1726867306.97301: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 17:21:46 -0400 (0:00:00.093) 0:00:01.609 ****** 18662 1726867306.97373: entering _queue_task() for managed_node2/setup 18662 1726867306.98080: worker is 1 (out of 1 available) 18662 1726867306.98091: exiting _queue_task() for managed_node2/setup 18662 1726867306.98104: done queuing things up, now waiting for results queue to drain 18662 1726867306.98106: waiting for pending results... 18662 1726867306.98755: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 18662 1726867306.98958: in run() - task 0affcac9-a3a5-efab-a8ce-00000000008d 18662 1726867306.98998: variable 'ansible_search_path' from source: unknown 18662 1726867306.99010: variable 'ansible_search_path' from source: unknown 18662 1726867306.99185: calling self._execute() 18662 1726867306.99242: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867306.99425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867306.99430: variable 'omit' from source: magic vars 18662 1726867307.00785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867307.04690: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867307.04764: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867307.04854: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867307.04963: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867307.05062: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867307.05267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867307.05303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867307.05381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18662 1726867307.05451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867307.05596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867307.05931: variable 'ansible_facts' from source: unknown 18662 1726867307.06088: variable 'network_test_required_facts' from source: task vars 18662 1726867307.06335: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 18662 1726867307.06339: variable 'omit' from source: magic vars 18662 1726867307.06341: variable 'omit' from source: magic vars 18662 1726867307.06447: variable 'omit' from source: magic vars 18662 1726867307.06479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867307.06514: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867307.06540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867307.06571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867307.06634: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867307.06697: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867307.06778: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867307.06789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867307.07094: Set connection var ansible_timeout to 10 18662 1726867307.07097: Set connection var ansible_connection to ssh 18662 1726867307.07100: Set connection var ansible_shell_executable to /bin/sh 18662 1726867307.07102: Set connection var ansible_shell_type to sh 18662 1726867307.07105: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867307.07107: Set connection var ansible_pipelining to False 18662 1726867307.07113: variable 'ansible_shell_executable' from source: unknown 18662 1726867307.07116: variable 'ansible_connection' from source: unknown 18662 1726867307.07118: variable 'ansible_module_compression' from source: unknown 18662 1726867307.07120: variable 'ansible_shell_type' from source: unknown 18662 1726867307.07122: variable 'ansible_shell_executable' from source: unknown 18662 1726867307.07124: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867307.07126: variable 'ansible_pipelining' from source: unknown 18662 1726867307.07128: variable 'ansible_timeout' from source: unknown 18662 1726867307.07130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867307.07471: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867307.07490: variable 'omit' from source: magic vars 18662 1726867307.07502: starting attempt loop 18662 
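The conditional logged just above is the task's when clause, and the action is the setup module, so el_repo_setup.yml:3 can be sketched roughly as below. This is a hedged reconstruction: the gather_subset value and the example contents of network_test_required_facts are assumptions; the trace only confirms the task name, the setup action, that network_test_required_facts comes from task vars, and that the condition evaluated to True.

- name: Gather the minimum subset of ansible_facts required by the network role test
  ansible.builtin.setup:
    gather_subset: min            # assumed subset; not visible in the trace
  when: >-
    not ansible_facts.keys() | list | intersect(network_test_required_facts)
    == network_test_required_facts
  vars:
    network_test_required_facts:  # example values only; the real list is not shown here
      - distribution
      - distribution_major_version
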
1726867307.07533: running the handler 18662 1726867307.07549: _low_level_execute_command(): starting 18662 1726867307.07751: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867307.09082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867307.09191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867307.09406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867307.09492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867307.11171: stdout chunk (state=3): >>>/root <<< 18662 1726867307.11369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867307.11390: stdout chunk (state=3): >>><<< 18662 1726867307.11432: stderr chunk (state=3): >>><<< 18662 1726867307.11499: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867307.11536: _low_level_execute_command(): starting 18662 1726867307.11548: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486 `" && echo ansible-tmp-1726867307.115153-18724-2201461015486="` echo /root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486 `" ) && sleep 0' 18662 1726867307.12747: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867307.12948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867307.13330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867307.13417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867307.15486: stdout chunk (state=3): >>>ansible-tmp-1726867307.115153-18724-2201461015486=/root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486 <<< 18662 1726867307.15560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867307.15563: stdout chunk (state=3): >>><<< 18662 1726867307.15566: stderr chunk (state=3): >>><<< 18662 1726867307.15617: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867307.115153-18724-2201461015486=/root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867307.15889: variable 'ansible_module_compression' from source: unknown 18662 1726867307.15892: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18662 1726867307.15996: variable 'ansible_facts' from source: unknown 18662 1726867307.16315: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486/AnsiballZ_setup.py 18662 1726867307.16446: Sending initial data 18662 
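Each _low_level_execute_command() in this trace reuses the persistent SSH control socket ("auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8'"), so only the first connection pays the full handshake cost. Connection behaviour like this is driven by per-host variables; the sketch below is illustrative host_vars consistent with the "Set connection var" entries above, not the actual inventory for this run, and the ControlMaster/ControlPersist arguments are assumptions since the ansible.cfg used here is not shown.

# host_vars/managed_node2.yml -- illustrative only
ansible_host: 10.31.12.116
ansible_connection: ssh
ansible_timeout: 10
ansible_pipelining: false
ansible_ssh_common_args: >-      # assumed; shown only to match the control-socket reuse above
  -o ControlMaster=auto
  -o ControlPersist=60s
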
1726867307.16489: Sent initial data (151 bytes) 18662 1726867307.17102: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867307.17195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867307.17228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867307.17249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867307.17316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867307.17331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867307.18967: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867307.18998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867307.19111: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpdfrhavle /root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486/AnsiballZ_setup.py <<< 18662 1726867307.19115: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486/AnsiballZ_setup.py" <<< 18662 1726867307.19118: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpdfrhavle" to remote "/root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486/AnsiballZ_setup.py" <<< 18662 1726867307.21585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867307.21594: stdout chunk (state=3): >>><<< 18662 1726867307.21597: stderr chunk (state=3): >>><<< 18662 1726867307.21600: done transferring module to remote 18662 1726867307.21602: _low_level_execute_command(): starting 18662 1726867307.21604: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486/ /root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486/AnsiballZ_setup.py && sleep 0' 18662 1726867307.22771: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867307.22789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867307.22811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867307.22874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867307.22966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867307.23039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867307.23059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867307.23190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867307.25104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867307.25169: stderr chunk (state=3): >>><<< 18662 1726867307.25172: stdout chunk (state=3): >>><<< 18662 1726867307.25203: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867307.25302: _low_level_execute_command(): starting 18662 1726867307.25305: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486/AnsiballZ_setup.py && sleep 0' 18662 1726867307.26434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867307.26592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867307.26606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867307.26621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867307.26670: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867307.26785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867307.26829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867307.26995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867307.29113: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 18662 1726867307.29136: stdout chunk (state=3): >>>import _imp # builtin <<< 18662 1726867307.29174: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 18662 1726867307.29237: stdout chunk (state=3): >>>import '_io' # <<< 18662 1726867307.29251: stdout chunk (state=3): >>>import 'marshal' # <<< 18662 1726867307.29397: stdout chunk (state=3): >>>import 'posix' # <<< 18662 1726867307.29417: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 18662 1726867307.29439: stdout chunk (state=3): >>>import 'codecs' # <<< 18662 1726867307.29468: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 18662 1726867307.29517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be2184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be1e7b30> <<< 18662 1726867307.29542: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be21aa50> <<< 18662 1726867307.29565: stdout chunk (state=3): >>>import '_signal' # <<< 18662 1726867307.29588: stdout chunk (state=3): >>>import '_abc' # <<< 18662 1726867307.29616: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 18662 1726867307.29756: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 18662 1726867307.29759: stdout chunk (state=3): >>>import '_collections_abc' # <<< 18662 1726867307.29765: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 18662 1726867307.29816: stdout chunk (state=3): >>>import 'os' # <<< 18662 1726867307.29829: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 18662 1726867307.29869: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 18662 1726867307.29889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be02d130> <<< 18662 1726867307.29961: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be02dfa0> <<< 18662 1726867307.29990: stdout chunk (state=3): >>>import 'site' # <<< 18662 1726867307.30062: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 18662 1726867307.30399: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 18662 1726867307.30426: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867307.30455: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 18662 1726867307.30493: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 18662 1726867307.30520: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 18662 1726867307.30541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be06bec0> <<< 18662 1726867307.30631: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be06bf80> <<< 18662 1726867307.30642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 18662 1726867307.30720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 18662 1726867307.30737: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 18662 1726867307.30760: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0a3830> <<< 18662 1726867307.30798: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0a3ec0> <<< 18662 1726867307.30902: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be083b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0812b0> <<< 18662 1726867307.30987: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be069070> <<< 18662 1726867307.31015: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 18662 1726867307.31071: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 18662 1726867307.31079: stdout chunk (state=3): >>>import '_sre' # # 
/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 18662 1726867307.31166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0c23f0> <<< 18662 1726867307.31247: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be082150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0c0bc0> <<< 18662 1726867307.31254: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 18662 1726867307.31281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0682f0> <<< 18662 1726867307.31379: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 18662 1726867307.31395: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7be0f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7be0f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be066e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867307.31422: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 18662 1726867307.31453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0f9370> import 'importlib.machinery' # <<< 18662 1726867307.31519: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code 
object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0fa540> <<< 18662 1726867307.31640: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 18662 1726867307.31645: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be110740> <<< 18662 1726867307.31671: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867307.31689: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7be111e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 18662 1726867307.31721: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 18662 1726867307.31916: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be112cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7be1132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be112210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7be113d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be1134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0fa4b0> <<< 18662 1726867307.31942: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 18662 1726867307.31962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 18662 1726867307.31972: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 18662 1726867307.31990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 18662 1726867307.32026: stdout chunk (state=3): >>># extension module 'math' loaded from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bde63c50> <<< 18662 1726867307.32057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 18662 1726867307.32082: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bde8c710> <<< 18662 1726867307.32160: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bde8c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bde8c740> <<< 18662 1726867307.32185: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 18662 1726867307.32243: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867307.32338: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bde8d070> <<< 18662 1726867307.32529: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bde8da60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bde8c920> <<< 18662 1726867307.32544: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bde61df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 18662 1726867307.32705: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bde8ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bde8db50> <<< 18662 1726867307.32708: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object 
from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 18662 1726867307.32738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 18662 1726867307.32764: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdeb3170> <<< 18662 1726867307.32828: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 18662 1726867307.32927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867307.32944: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdedb500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 18662 1726867307.32969: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 18662 1726867307.33020: stdout chunk (state=3): >>>import 'ntpath' # <<< 18662 1726867307.33066: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdf3c260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 18662 1726867307.33153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 18662 1726867307.33170: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 18662 1726867307.33185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 18662 1726867307.33244: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdf3e9c0> <<< 18662 1726867307.34020: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdf3c380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdf01250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdd41340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdeda330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bde8fd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd7bdd415e0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_f8e5tw74/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 18662 1726867307.34047: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 18662 1726867307.34122: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 18662 1726867307.34155: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bddab0b0> import '_typing' # <<< 18662 1726867307.34344: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdd89fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdd89130> # zipimport: zlib available <<< 18662 1726867307.34572: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 18662 1726867307.35832: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.36957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdda8f80> <<< 18662 1726867307.36987: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867307.37008: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 18662 1726867307.37045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 18662 1726867307.37073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 18662 1726867307.37271: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bddda990> <<< 18662 1726867307.37301: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bddda780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bddda090> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bddda4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bddabad0> import 
'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bdddb740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bdddb890> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 18662 1726867307.37344: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 18662 1726867307.37362: stdout chunk (state=3): >>>import '_locale' # <<< 18662 1726867307.37404: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdddbcb0> import 'pwd' # <<< 18662 1726867307.37453: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 18662 1726867307.37456: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 18662 1726867307.37523: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd731b20> <<< 18662 1726867307.37584: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd733740> <<< 18662 1726867307.37725: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 18662 1726867307.37789: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd733fb0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd734f50> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd737c20> <<< 18662 1726867307.37827: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd737fb0> <<< 18662 1726867307.37853: stdout chunk (state=3): >>>import 'subprocess' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd735f10> <<< 18662 1726867307.37928: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 18662 1726867307.37951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 18662 1726867307.38140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd73fbc0> import '_tokenize' # <<< 18662 1726867307.38460: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd73e690> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd73e3f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd73e960> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd736420> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd783e90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd7838c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 18662 1726867307.38464: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 18662 1726867307.38466: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 18662 1726867307.38492: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd785a60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd785820> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 18662 1726867307.38507: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18662 1726867307.38541: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd787ef0> <<< 18662 1726867307.38563: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd786090> <<< 18662 1726867307.38579: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 18662 1726867307.38612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867307.38641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 18662 1726867307.38650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 18662 1726867307.38869: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd78b650> <<< 18662 1726867307.38872: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd787fe0> <<< 18662 1726867307.39001: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd78c440> <<< 18662 1726867307.39004: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd78c470> <<< 18662 1726867307.39059: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd78ca40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd7840e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867307.39082: stdout chunk (state=3): >>># extension module '_socket' executed from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd618170> <<< 18662 1726867307.39235: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd619610> <<< 18662 1726867307.39253: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd78e900> <<< 18662 1726867307.39280: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867307.39341: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd78fcb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd78e540> # zipimport: zlib available <<< 18662 1726867307.39556: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 18662 1726867307.39693: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.39811: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.40357: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.40962: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867307.41002: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd61d850> <<< 18662 1726867307.41082: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 18662 1726867307.41215: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd61e660> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd619910> <<< 18662 1726867307.41237: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 18662 1726867307.41510: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 18662 1726867307.41539: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 18662 1726867307.41814: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd61e630> # zipimport: zlib available <<< 18662 1726867307.42249: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.42967: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.43119: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.43170: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 18662 1726867307.43189: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.43276: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 18662 1726867307.43381: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.43509: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 18662 1726867307.43534: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.43708: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 18662 1726867307.44106: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.44471: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd61f9b0> <<< 18662 1726867307.44555: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.44570: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.44635: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 18662 1726867307.44654: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 18662 1726867307.44681: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.44702: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.44744: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 18662 1726867307.44800: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.44905: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.44959: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 18662 1726867307.45006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867307.45112: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd62a180> <<< 18662 1726867307.45130: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd620c50> <<< 18662 1726867307.45154: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 18662 1726867307.45170: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.45230: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.45288: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.45323: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.45353: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 18662 1726867307.45380: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 18662 1726867307.45434: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 18662 1726867307.45445: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 18662 1726867307.45515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 18662 1726867307.45518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18662 1726867307.45651: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd7029c0> <<< 18662 1726867307.45655: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd7fe690> <<< 18662 1726867307.45699: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd629ee0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd61c140> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 18662 1726867307.45783: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 18662 1726867307.45806: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 18662 1726867307.45834: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 18662 1726867307.45887: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.46028: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.46031: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.46050: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.46119: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.46224: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 18662 1726867307.46244: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.46350: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.46391: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 18662 1726867307.46759: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.46934: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.47005: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867307.47039: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 18662 1726867307.47066: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 18662 1726867307.47070: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 18662 1726867307.47119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd6ba570> <<< 18662 1726867307.47153: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 18662 1726867307.47171: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 18662 1726867307.47217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 18662 1726867307.47269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 18662 1726867307.47287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 18662 1726867307.47320: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2a8140> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867307.47342: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd2a84a0> <<< 18662 1726867307.47401: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd6a6cf0> <<< 18662 1726867307.47459: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd6bb050> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd6b8c80> import 'multiprocessing' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd6b87d0> <<< 18662 1726867307.47499: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 18662 1726867307.47566: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 18662 1726867307.47602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 18662 1726867307.47614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 18662 1726867307.47658: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd2ab410> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2aacc0> <<< 18662 1726867307.47699: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd2aaea0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2aa0f0> <<< 18662 1726867307.47722: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 18662 1726867307.47897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2ab560> <<< 18662 1726867307.47927: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 18662 1726867307.47989: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd2f6060> <<< 18662 1726867307.48082: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2abcb0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd6b8920> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 18662 1726867307.48185: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 18662 1726867307.48491: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # <<< 18662 1726867307.48533: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 18662 1726867307.48556: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # <<< 18662 1726867307.48631: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.48688: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 18662 1726867307.48765: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.48805: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 18662 1726867307.48843: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.48898: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.49062: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.49132: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 18662 1726867307.49166: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 18662 1726867307.49972: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.50525: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.50561: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.50594: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.50643: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 18662 1726867307.50646: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.50673: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.50740: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 18662 1726867307.50766: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.50822: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 18662 1726867307.50879: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.50899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 18662 1726867307.50980: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # <<< 18662 1726867307.50984: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.51087: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.51143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 18662 1726867307.51301: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2f7380> <<< 18662 1726867307.51317: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches 
/usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 18662 1726867307.51580: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2f6c00> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 18662 1726867307.51794: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.51810: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 18662 1726867307.51854: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.51897: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 18662 1726867307.51945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 18662 1726867307.52044: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867307.52084: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd3362a0> <<< 18662 1726867307.52391: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd327140> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 18662 1726867307.52428: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 18662 1726867307.52897: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.52921: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 18662 1726867307.53023: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.53027: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 18662 1726867307.53120: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd349c40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd335850> import 'ansible.module_utils.facts.system.user' # <<< 18662 1726867307.53123: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.53125: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 18662 1726867307.53128: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 
1726867307.53246: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.53260: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 18662 1726867307.53265: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.53586: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # <<< 18662 1726867307.53594: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.53615: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.53725: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.53800: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 18662 1726867307.53950: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.53953: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.53993: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.54313: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.54373: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 18662 1726867307.54414: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.54480: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.55048: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.55697: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.55749: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 18662 1726867307.55752: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.55853: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.55945: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 18662 1726867307.55990: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.56101: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.56256: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 18662 1726867307.56354: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.56399: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 18662 1726867307.56797: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.56981: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 18662 1726867307.57204: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: 
zlib available <<< 18662 1726867307.57275: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 18662 1726867307.57283: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.57383: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 18662 1726867307.57396: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.57452: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 18662 1726867307.57519: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.57565: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 18662 1726867307.57579: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.57892: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.58098: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 18662 1726867307.58116: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.58306: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 18662 1726867307.58312: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.58334: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.58367: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 18662 1726867307.58432: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.58445: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 18662 1726867307.58541: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.58606: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 18662 1726867307.58629: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.58906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.58953: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.59028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 18662 1726867307.59047: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 18662 1726867307.59090: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.59171: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 18662 1726867307.59485: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.59695: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.59743: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 18662 1726867307.59747: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.59828: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.60024: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867307.60107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 18662 1726867307.60113: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 18662 1726867307.60294: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867307.61161: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 18662 1726867307.61188: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 18662 1726867307.61211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 18662 1726867307.61247: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd14bc80> <<< 18662 1726867307.61274: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd149370> <<< 18662 1726867307.61351: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd148e30> <<< 18662 1726867307.61772: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": 
"38", "day": "20", "hour": "17", "minute": "21", "second": "47", "epoch": "1726867307", "epoch_int": "1726867307", "date": "2024-09-20", "time": "17:21:47", "iso8601_micro": "2024-09-20T21:21:47.606899Z", "iso8601": "2024-09-20T21:21:47Z", "iso8601_basic": "20240920T172147606899", "iso8601_basic_short": "20240920T172147", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr<<< 18662 1726867307.61784: stdout chunk (state=3): >>>/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_service_mgr": "systemd", "gather_subset": ["min"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18662 1726867307.62334: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools <<< 18662 1726867307.62387: stdout chunk (state=3): >>># cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # 
cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache <<< 18662 1726867307.62416: stdout chunk (state=3): >>># cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves <<< 18662 1726867307.62647: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # 
destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 18662 1726867307.62656: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue <<< 18662 1726867307.62659: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd <<< 18662 1726867307.62662: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly <<< 18662 1726867307.62664: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat 
# cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns<<< 18662 1726867307.62668: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user<<< 18662 1726867307.62691: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base <<< 18662 1726867307.62695: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # 
cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 18662 1726867307.62992: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 18662 1726867307.63269: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 18662 1726867307.63338: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime <<< 18662 1726867307.63367: stdout chunk (state=3): >>># destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 18662 1726867307.63373: stdout chunk (state=3): >>># destroy errno # destroy json <<< 18662 1726867307.63500: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 18662 1726867307.63506: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 18662 1726867307.63512: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 18662 1726867307.63515: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 18662 1726867307.63522: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect 
# cleanup[3] wiping math # cleanup[3] wiping warnings <<< 18662 1726867307.63663: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 18662 1726867307.63667: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18662 1726867307.63812: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 18662 1726867307.63902: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 18662 1726867307.63920: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 18662 1726867307.64018: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 18662 1726867307.64124: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc <<< 18662 1726867307.64683: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 18662 1726867307.64686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867307.64689: stdout chunk (state=3): >>><<< 18662 1726867307.64691: stderr chunk (state=3): >>><<< 18662 1726867307.65110: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be2184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be1e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be21aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be02d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be02dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be06bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be06bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be083b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be069070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be082150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0c0bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7be0f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7be0f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be066e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0f9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0fa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be110740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7be111e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fd7be112cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7be1132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be112210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7be113d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be1134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0fa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bde63c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bde8c710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bde8c470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bde8c740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bde8d070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bde8da60> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd7bde8c920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bde61df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bde8ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bde8db50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7be0fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdeb3170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdedb500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdf3c260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdf3e9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdf3c380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdf01250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdd41340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdeda330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bde8fd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7fd7bdd415e0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_f8e5tw74/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bddab0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdd89fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdd89130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdda8f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bddda990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bddda780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bddda090> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bddda4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bddabad0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bdddb740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bdddb890> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bdddbcb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd731b20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd733740> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd733fb0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd734f50> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd737c20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd737fb0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd735f10> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd73fbc0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd73e690> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd73e3f0> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd73e960> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd736420> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd783e90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd7838c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd785a60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd785820> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd787ef0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd786090> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd78b650> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd787fe0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd78c440> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd78c470> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd78ca40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd7840e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd618170> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd619610> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd78e900> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd78fcb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd78e540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd61d850> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd61e660> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd619910> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd61e630> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd61f9b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd62a180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd620c50> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd7029c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd7fe690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd629ee0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd61c140> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd6ba570> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2a8140> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd2a84a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd6a6cf0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd6bb050> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd6b8c80> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd6b87d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd2ab410> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2aacc0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd2aaea0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2aa0f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2ab560> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd2f6060> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2abcb0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd6b8920> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2f7380> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd2f6c00> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd3362a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd327140> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd349c40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd335850> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd7bd14bc80> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd149370> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd7bd148e30> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "47", "epoch": "1726867307", "epoch_int": "1726867307", "date": "2024-09-20", "time": "17:21:47", "iso8601_micro": "2024-09-20T21:21:47.606899Z", "iso8601": "2024-09-20T21:21:47Z", "iso8601_basic": "20240920T172147606899", "iso8601_basic_short": "20240920T172147", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": 
"ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy 
_weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # 
destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] 
wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
[WARNING]: Module invocation had junk after the JSON data: # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 18662 1726867307.67131: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867307.67135: _low_level_execute_command(): starting 18662 1726867307.67137: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867307.115153-18724-2201461015486/ > /dev/null 2>&1 && sleep 0' 18662 1726867307.67139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867307.67142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867307.67144: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867307.67146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867307.67148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867307.67150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867307.67152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867307.67221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867307.68905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867307.68930: stderr chunk (state=3): >>><<< 18662 1726867307.68979: stdout chunk (state=3): >>><<< 18662 1726867307.68993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867307.69003: handler run complete 18662 1726867307.69283: variable 'ansible_facts' from source: unknown 18662 1726867307.69288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867307.69738: variable 'ansible_facts' from source: unknown 18662 1726867307.69741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867307.69789: attempt loop complete, returning result 18662 1726867307.69797: _execute() done 18662 1726867307.69805: dumping result to json 18662 1726867307.69825: done dumping result, returning 18662 1726867307.69857: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcac9-a3a5-efab-a8ce-00000000008d] 18662 1726867307.70066: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000008d 18662 1726867307.70156: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000008d 18662 1726867307.70159: WORKER PROCESS EXITING ok: [managed_node2] 18662 1726867307.70274: no more pending results, returning what we have 18662 1726867307.70279: results queue empty 18662 1726867307.70280: checking for any_errors_fatal 18662 1726867307.70282: done checking for any_errors_fatal 18662 
1726867307.70283: checking for max_fail_percentage 18662 1726867307.70284: done checking for max_fail_percentage 18662 1726867307.70285: checking to see if all hosts have failed and the running result is not ok 18662 1726867307.70286: done checking to see if all hosts have failed 18662 1726867307.70286: getting the remaining hosts for this loop 18662 1726867307.70288: done getting the remaining hosts for this loop 18662 1726867307.70291: getting the next task for host managed_node2 18662 1726867307.70300: done getting next task for host managed_node2 18662 1726867307.70303: ^ task is: TASK: Check if system is ostree 18662 1726867307.70306: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867307.70312: getting variables 18662 1726867307.70314: in VariableManager get_vars() 18662 1726867307.70344: Calling all_inventory to load vars for managed_node2 18662 1726867307.70346: Calling groups_inventory to load vars for managed_node2 18662 1726867307.70349: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867307.70360: Calling all_plugins_play to load vars for managed_node2 18662 1726867307.70364: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867307.70368: Calling groups_plugins_play to load vars for managed_node2 18662 1726867307.71193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867307.71629: done with get_vars() 18662 1726867307.71639: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 17:21:47 -0400 (0:00:00.744) 0:00:02.353 ****** 18662 1726867307.71850: entering _queue_task() for managed_node2/stat 18662 1726867307.72315: worker is 1 (out of 1 available) 18662 1726867307.72444: exiting _queue_task() for managed_node2/stat 18662 1726867307.72456: done queuing things up, now waiting for results queue to drain 18662 1726867307.72457: waiting for pending results... 
18662 1726867307.73000: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 18662 1726867307.73005: in run() - task 0affcac9-a3a5-efab-a8ce-00000000008f 18662 1726867307.73010: variable 'ansible_search_path' from source: unknown 18662 1726867307.73013: variable 'ansible_search_path' from source: unknown 18662 1726867307.73121: calling self._execute() 18662 1726867307.73282: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867307.73316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867307.73331: variable 'omit' from source: magic vars 18662 1726867307.73838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867307.74104: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867307.74154: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867307.74226: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867307.74264: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867307.74370: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867307.74410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867307.74439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867307.74467: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867307.74636: Evaluated conditional (not __network_is_ostree is defined): True 18662 1726867307.74653: variable 'omit' from source: magic vars 18662 1726867307.74700: variable 'omit' from source: magic vars 18662 1726867307.74783: variable 'omit' from source: magic vars 18662 1726867307.74791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867307.74826: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867307.74864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867307.74951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867307.74954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867307.74957: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867307.74959: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867307.74962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867307.75072: Set connection var ansible_timeout to 10 18662 1726867307.75083: Set connection var ansible_connection to ssh 18662 1726867307.75094: Set connection var 
ansible_shell_executable to /bin/sh 18662 1726867307.75101: Set connection var ansible_shell_type to sh 18662 1726867307.75119: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867307.75129: Set connection var ansible_pipelining to False 18662 1726867307.75164: variable 'ansible_shell_executable' from source: unknown 18662 1726867307.75176: variable 'ansible_connection' from source: unknown 18662 1726867307.75278: variable 'ansible_module_compression' from source: unknown 18662 1726867307.75282: variable 'ansible_shell_type' from source: unknown 18662 1726867307.75285: variable 'ansible_shell_executable' from source: unknown 18662 1726867307.75287: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867307.75289: variable 'ansible_pipelining' from source: unknown 18662 1726867307.75291: variable 'ansible_timeout' from source: unknown 18662 1726867307.75293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867307.75390: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867307.75482: variable 'omit' from source: magic vars 18662 1726867307.75492: starting attempt loop 18662 1726867307.75495: running the handler 18662 1726867307.75498: _low_level_execute_command(): starting 18662 1726867307.75500: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867307.76200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867307.76221: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867307.76268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867307.76293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867307.76389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867307.76423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867307.76454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867307.76485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867307.76983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18662 1726867307.78488: stdout chunk (state=3): >>>/root <<< 18662 1726867307.78664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867307.78676: stdout chunk (state=3): >>><<< 18662 1726867307.78715: stderr chunk (state=3): >>><<< 18662 1726867307.78740: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18662 1726867307.78774: _low_level_execute_command(): starting 18662 1726867307.78801: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151 `" && echo ansible-tmp-1726867307.7876077-18749-112723272994151="` echo /root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151 `" ) && sleep 0' 18662 1726867307.79339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867307.79353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867307.79373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867307.79394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867307.79412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867307.79425: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867307.79519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867307.79539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867307.79615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18662 1726867307.82508: stdout chunk (state=3): >>>ansible-tmp-1726867307.7876077-18749-112723272994151=/root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151 <<< 18662 1726867307.82596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 
1726867307.82612: stdout chunk (state=3): >>><<< 18662 1726867307.82638: stderr chunk (state=3): >>><<< 18662 1726867307.82659: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867307.7876077-18749-112723272994151=/root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18662 1726867307.82732: variable 'ansible_module_compression' from source: unknown 18662 1726867307.83089: ANSIBALLZ: Using lock for stat 18662 1726867307.83092: ANSIBALLZ: Acquiring lock 18662 1726867307.83095: ANSIBALLZ: Lock acquired: 140264020906912 18662 1726867307.83097: ANSIBALLZ: Creating module 18662 1726867307.97673: ANSIBALLZ: Writing module into payload 18662 1726867307.97736: ANSIBALLZ: Writing module 18662 1726867307.97751: ANSIBALLZ: Renaming module 18662 1726867307.97756: ANSIBALLZ: Done creating module 18662 1726867307.97770: variable 'ansible_facts' from source: unknown 18662 1726867307.97825: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151/AnsiballZ_stat.py 18662 1726867307.97929: Sending initial data 18662 1726867307.97932: Sent initial data (153 bytes) 18662 1726867307.98458: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867307.98562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867307.98601: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 4 <<< 18662 1726867308.01049: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867308.01127: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867308.01282: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp2xfk5u7m /root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151/AnsiballZ_stat.py <<< 18662 1726867308.01286: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151/AnsiballZ_stat.py" <<< 18662 1726867308.01320: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp2xfk5u7m" to remote "/root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151/AnsiballZ_stat.py" <<< 18662 1726867308.02544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867308.02553: stdout chunk (state=3): >>><<< 18662 1726867308.02566: stderr chunk (state=3): >>><<< 18662 1726867308.02627: done transferring module to remote 18662 1726867308.02698: _low_level_execute_command(): starting 18662 1726867308.02805: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151/ /root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151/AnsiballZ_stat.py && sleep 0' 18662 1726867308.04181: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867308.04185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867308.04187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867308.04189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867308.04206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867308.04220: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867308.04314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867308.04512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867308.04553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18662 1726867308.07246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867308.07257: stdout chunk (state=3): >>><<< 18662 1726867308.07270: stderr chunk (state=3): >>><<< 18662 1726867308.07471: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18662 1726867308.07475: _low_level_execute_command(): starting 18662 1726867308.07481: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151/AnsiballZ_stat.py && sleep 0' 18662 1726867308.08240: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867308.08260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867308.08374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867308.08398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867308.08493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867308.08518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867308.08551: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867308.08580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867308.08692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18662 1726867308.12022: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 18662 1726867308.12087: stdout chunk (state=3): >>>import '_io' # <<< 18662 1726867308.12090: stdout chunk (state=3): >>> import 'marshal' # <<< 18662 1726867308.12093: stdout chunk (state=3): >>> <<< 18662 1726867308.12135: stdout chunk (state=3): >>>import 'posix' # <<< 18662 1726867308.12181: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 18662 1726867308.12199: stdout chunk (state=3): >>> # installing zipimport hook<<< 18662 1726867308.12229: stdout chunk (state=3): >>> import 'time' # <<< 18662 1726867308.12244: stdout chunk (state=3): >>>import 'zipimport' # <<< 18662 1726867308.12259: stdout chunk (state=3): >>> # installed zipimport hook<<< 18662 1726867308.12297: stdout chunk (state=3): >>> <<< 18662 1726867308.12330: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py<<< 18662 1726867308.12404: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867308.12435: stdout chunk (state=3): >>>import '_codecs' # <<< 18662 1726867308.12506: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 18662 1726867308.12517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 18662 1726867308.12539: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36d104d0> <<< 18662 1726867308.12637: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36cdfb30> <<< 18662 1726867308.12649: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 18662 1726867308.12679: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36d12a50> import '_signal' # import '_abc' # import 'abc' # <<< 18662 1726867308.12761: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # <<< 18662 1726867308.12892: stdout chunk (state=3): >>>import '_collections_abc' # <<< 18662 1726867308.12951: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # <<< 18662 1726867308.12965: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 18662 1726867308.12991: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 18662 1726867308.13014: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 18662 1726867308.13114: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py <<< 18662 1726867308.13120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 18662 1726867308.13134: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36ae5130> <<< 18662 1726867308.13165: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 18662 1726867308.13181: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36ae5fa0> <<< 18662 1726867308.13247: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 18662 1726867308.13605: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 18662 1726867308.13621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 18662 1726867308.13705: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 18662 1726867308.13764: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 18662 1726867308.13811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b23ec0> <<< 18662 1726867308.13856: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 18662 1726867308.13887: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b23f80> <<< 18662 1726867308.13939: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 18662 1726867308.13968: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 18662 1726867308.14034: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867308.14061: stdout chunk (state=3): >>>import 'itertools' # <<< 18662 1726867308.14123: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b5b830> # 
/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 18662 1726867308.14126: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b5bec0> <<< 18662 1726867308.14187: stdout chunk (state=3): >>>import '_collections' # <<< 18662 1726867308.14242: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b3bb60> <<< 18662 1726867308.14303: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b392b0> <<< 18662 1726867308.14445: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b21070><<< 18662 1726867308.14457: stdout chunk (state=3): >>> <<< 18662 1726867308.14516: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 18662 1726867308.14532: stdout chunk (state=3): >>>import '_sre' # <<< 18662 1726867308.14594: stdout chunk (state=3): >>> <<< 18662 1726867308.14629: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 18662 1726867308.14644: stdout chunk (state=3): >>> <<< 18662 1726867308.14671: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc'<<< 18662 1726867308.14733: stdout chunk (state=3): >>> import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b7b7d0><<< 18662 1726867308.14765: stdout chunk (state=3): >>> <<< 18662 1726867308.14768: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b7a3f0> <<< 18662 1726867308.14820: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 18662 1726867308.14830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b3a150><<< 18662 1726867308.14908: stdout chunk (state=3): >>> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b78bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 18662 1726867308.14952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb0890> <<< 18662 1726867308.14993: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b202f0> <<< 18662 1726867308.15004: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc'<<< 18662 1726867308.15071: stdout chunk (state=3): >>> # extension module '_struct' loaded from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.15089: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36bb0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb0bf0><<< 18662 1726867308.15133: stdout chunk (state=3): >>> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.15168: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36bb0fe0> <<< 18662 1726867308.15196: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b1ee10> <<< 18662 1726867308.15231: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 18662 1726867308.15234: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc'<<< 18662 1726867308.15275: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 18662 1726867308.15400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc'<<< 18662 1726867308.15404: stdout chunk (state=3): >>> import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb1370> import 'importlib.machinery' # <<< 18662 1726867308.15442: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb2540> <<< 18662 1726867308.15463: stdout chunk (state=3): >>>import 'importlib.util' # <<< 18662 1726867308.15485: stdout chunk (state=3): >>> import 'runpy' # <<< 18662 1726867308.15570: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 18662 1726867308.15647: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 18662 1726867308.15654: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bc8740><<< 18662 1726867308.15681: stdout chunk (state=3): >>> import 'errno' # <<< 18662 1726867308.15725: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 18662 1726867308.15783: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36bc9e20> # 
/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 18662 1726867308.15800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 18662 1726867308.15846: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 18662 1726867308.15868: stdout chunk (state=3): >>> import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bcacc0> <<< 18662 1726867308.15925: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 18662 1726867308.15952: stdout chunk (state=3): >>> # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36bcb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bca210> <<< 18662 1726867308.15992: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 18662 1726867308.16042: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 18662 1726867308.16076: stdout chunk (state=3): >>> import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36bcbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bcb4a0> <<< 18662 1726867308.16199: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 18662 1726867308.16236: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py<<< 18662 1726867308.16266: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 18662 1726867308.16320: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.16355: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd3694fc50><<< 18662 1726867308.16383: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 18662 1726867308.16436: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.16441: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.16474: stdout chunk (state=3): >>>import 
'_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd369787a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36978500><<< 18662 1726867308.16504: stdout chunk (state=3): >>> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.16528: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.16572: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd369787d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 18662 1726867308.16580: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 18662 1726867308.16682: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 18662 1726867308.16872: stdout chunk (state=3): >>> # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36979100> <<< 18662 1726867308.17074: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 18662 1726867308.17092: stdout chunk (state=3): >>> # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36979af0><<< 18662 1726867308.17132: stdout chunk (state=3): >>> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd369789b0> <<< 18662 1726867308.17166: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3694ddf0> <<< 18662 1726867308.17190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 18662 1726867308.17218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 18662 1726867308.17258: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 18662 1726867308.17293: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 18662 1726867308.17306: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3697af00> <<< 18662 1726867308.17353: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36979c40><<< 18662 1726867308.17359: stdout chunk (state=3): >>> <<< 18662 1726867308.17389: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb2c60> <<< 18662 1726867308.17934: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from 
'/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd369a3230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd369c75f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 18662 1726867308.17937: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36a28380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 18662 1726867308.17940: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 18662 1726867308.17955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 18662 1726867308.17973: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 18662 1726867308.18062: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36a2aae0> <<< 18662 1726867308.18145: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36a284a0> <<< 18662 1726867308.18172: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd369e9370> <<< 18662 1726867308.18216: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36329430> <<< 18662 1726867308.18235: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd369c63f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3697be00> <<< 18662 1726867308.18512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fdd369c6750> <<< 18662 1726867308.18536: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_41knd26w/ansible_stat_payload.zip' # zipimport: zlib available <<< 18662 1726867308.18670: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.18711: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 18662 1726867308.18736: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 18662 1726867308.18754: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 18662 1726867308.18842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 18662 1726867308.19106: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 18662 1726867308.19133: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3637f170> import '_typing' # <<< 18662 1726867308.19185: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3635e060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3635d1c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 18662 1726867308.19217: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 18662 1726867308.19237: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 18662 1726867308.19319: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.21473: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.22895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 18662 1726867308.22988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3637d040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd363a6ab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd363a6840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd363a6150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 18662 1726867308.23006: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd363a65a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3637fe00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd363a7830> <<< 18662 1726867308.23025: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd363a7a70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 18662 1726867308.23086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 18662 1726867308.23136: stdout chunk (state=3): >>>import '_locale' # <<< 18662 1726867308.23156: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd363a7fb0> <<< 18662 1726867308.23216: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 18662 1726867308.23228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 18662 1726867308.23262: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36211d30> <<< 18662 1726867308.23307: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36213470> <<< 18662 1726867308.23331: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 18662 1726867308.23400: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36214320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 18662 1726867308.23431: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 18662 1726867308.23473: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd362154c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 18662 1726867308.23524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 18662 1726867308.23587: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 18662 1726867308.23616: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36217fb0> <<< 18662 1726867308.23663: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd3621c2f0> <<< 18662 1726867308.23711: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36216270> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 18662 1726867308.23767: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 18662 1726867308.23788: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 18662 1726867308.23846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 18662 1726867308.23880: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3621fec0> import '_tokenize' # <<< 18662 1726867308.23971: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3621e990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3621e720> <<< 18662 1726867308.24016: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 18662 1726867308.24211: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3621ec60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36216780> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36267fe0> <<< 18662 1726867308.24227: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36268320> <<< 18662 1726867308.24246: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 18662 1726867308.24280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 18662 1726867308.24343: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' 
import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36269d60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36269b20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 18662 1726867308.24547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 18662 1726867308.24614: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd3626c260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3626a420> <<< 18662 1726867308.24655: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 18662 1726867308.24711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 18662 1726867308.24731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 18662 1726867308.24818: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3626fa40> <<< 18662 1726867308.24989: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3626c410><<< 18662 1726867308.25084: stdout chunk (state=3): >>> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.25122: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd362708c0> <<< 18662 1726867308.25176: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 18662 1726867308.25196: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.25279: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd362708f0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.25387: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.25391: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36270da0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36268470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 18662 1726867308.25433: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 18662 1726867308.25483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 18662 1726867308.25530: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.25581: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 18662 1726867308.25623: stdout chunk (state=3): >>> <<< 18662 1726867308.25720: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd362fc320> <<< 18662 1726867308.25854: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 18662 1726867308.25887: stdout chunk (state=3): >>> # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 18662 1726867308.25909: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd362fd640> <<< 18662 1726867308.25985: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36272ab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36273e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd362726f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 18662 1726867308.26237: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 18662 1726867308.26240: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 18662 1726867308.26362: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.26475: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.27104: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.27546: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 18662 1726867308.27573: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 18662 1726867308.27606: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867308.27665: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36101880> <<< 18662 1726867308.27758: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36102570> <<< 18662 1726867308.27859: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd362fd760> <<< 18662 1726867308.27885: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 18662 1726867308.28011: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.28188: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 18662 1726867308.28202: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd361023f0> # zipimport: zlib available <<< 18662 1726867308.28692: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.29074: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.29160: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.29312: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867308.29316: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 18662 1726867308.29318: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.29387: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.29464: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 18662 1726867308.29641: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 18662 1726867308.29645: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.29647: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 18662 1726867308.29802: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.30030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 18662 1726867308.30100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 18662 1726867308.30112: stdout chunk (state=3): >>>import '_ast' # <<< 18662 1726867308.30218: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36103860> # zipimport: zlib available <<< 18662 1726867308.30249: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.30336: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 18662 1726867308.30361: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: 
zlib available <<< 18662 1726867308.30397: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.30444: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 18662 1726867308.30562: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 18662 1726867308.30587: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.30681: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 18662 1726867308.30791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867308.30821: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd3610e1b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd361091c0> <<< 18662 1726867308.30852: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 18662 1726867308.30866: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 18662 1726867308.30924: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.30996: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.31013: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.31056: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 18662 1726867308.31104: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 18662 1726867308.31134: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 18662 1726867308.31169: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 18662 1726867308.31220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 18662 1726867308.31258: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 18662 1726867308.31324: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd363faae0> <<< 18662 1726867308.31339: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd363ee7b0> <<< 18662 1726867308.31397: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3610e2d0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36103230> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' 
# <<< 18662 1726867308.31440: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.31463: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 18662 1726867308.31684: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 18662 1726867308.31696: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.31867: stdout chunk (state=3): >>># zipimport: zlib available <<< 18662 1726867308.32041: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 18662 1726867308.32337: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc <<< 18662 1726867308.32366: stdout chunk (state=3): >>># clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator <<< 18662 1726867308.32396: stdout chunk (state=3): >>># cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing 
importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2<<< 18662 1726867308.32439: stdout chunk (state=3): >>> # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl <<< 18662 1726867308.32473: stdout chunk (state=3): >>># cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] 
removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes <<< 18662 1726867308.32510: stdout chunk (state=3): >>># destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 18662 1726867308.32775: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 18662 1726867308.32831: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath <<< 18662 1726867308.32834: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib <<< 18662 1726867308.32998: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime 
# destroy selinux # destroy shutil <<< 18662 1726867308.33032: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 18662 1726867308.33086: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 18662 1726867308.33203: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 18662 1726867308.33246: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 18662 1726867308.33272: stdout chunk (state=3): >>># destroy _collections <<< 18662 1726867308.33307: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 18662 1726867308.33402: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 18662 1726867308.33425: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 18662 1726867308.33541: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 18662 1726867308.33554: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 18662 1726867308.34324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867308.34327: stdout chunk (state=3): >>><<< 18662 1726867308.34329: stderr chunk (state=3): >>><<< 18662 1726867308.34342: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36d104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36cdfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36d12a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36ae5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36ae5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b23ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b23f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b5b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b5bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b3bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b392b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b21070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b7b7d0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b7a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b3a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b78bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b202f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36bb0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb0bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36bb0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36b1ee10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bc8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36bc9e20> 
# /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bcacc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36bcb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bca210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36bcbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bcb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd3694fc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd369787a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36978500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd369787d0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import 
'_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36979100> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36979af0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd369789b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3694ddf0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3697af00> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36979c40> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36bb2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd369a3230> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd369c75f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36a28380> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36a2aae0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36a284a0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd369e9370> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36329430> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd369c63f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3697be00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fdd369c6750> # zipimport: found 30 names in '/tmp/ansible_stat_payload_41knd26w/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3637f170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3635e060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3635d1c0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3637d040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd363a6ab0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd363a6840> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd363a6150> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd363a65a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3637fe00> import 'atexit' # # extension module 'grp' loaded 
from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd363a7830> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd363a7a70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd363a7fb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36211d30> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36213470> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36214320> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd362154c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36217fb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd3621c2f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36216270> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3621fec0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3621e990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3621e720> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3621ec60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36216780> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36267fe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36268320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36269d60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36269b20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd3626c260> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3626a420> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3626fa40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3626c410> # extension module 'systemd._journal' 
loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd362708c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd362708f0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36270da0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36268470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd362fc320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd362fd640> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36272ab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36273e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd362726f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd36101880> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36102570> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd362fd760> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd361023f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36103860> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fdd3610e1b0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd361091c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd363faae0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd363ee7b0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd3610e2d0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fdd36103230> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # 
cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing 
socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy 
systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
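A note on the output above: the module's stdout mixes the stat result JSON with the Python interpreter's shutdown trace, which is why the controller prints the "junk after the JSON data" warning that follows; the result itself reports that /run/ostree-booted does not exist. A minimal sketch of the kind of task that produces this check, reconstructed only from the module_args and the task/register names seen later in this log (not copied from el_repo_setup.yml), is:

  # Sketch only: reconstructed from this log, not from the actual el_repo_setup.yml source.
  - name: Check if system is ostree
    ansible.builtin.stat:
      path: /run/ostree-booted
    register: __ostree_booted_stat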
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] 
removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 18662 1726867308.35588: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867308.35597: _low_level_execute_command(): starting 18662 1726867308.35600: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726867307.7876077-18749-112723272994151/ > /dev/null 2>&1 && sleep 0' 18662 1726867308.36212: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867308.36220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867308.36223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867308.36284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867308.36312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867308.38229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867308.38256: stderr chunk (state=3): >>><<< 18662 1726867308.38262: stdout chunk (state=3): >>><<< 18662 1726867308.38311: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867308.38315: handler run complete 18662 1726867308.38317: attempt loop complete, returning result 18662 1726867308.38320: _execute() done 18662 1726867308.38322: dumping result to json 18662 1726867308.38324: done dumping result, returning 18662 1726867308.38337: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0affcac9-a3a5-efab-a8ce-00000000008f] 18662 1726867308.38340: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000008f 18662 1726867308.38509: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000008f 18662 
1726867308.38512: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 18662 1726867308.38614: no more pending results, returning what we have 18662 1726867308.38617: results queue empty 18662 1726867308.38618: checking for any_errors_fatal 18662 1726867308.38665: done checking for any_errors_fatal 18662 1726867308.38667: checking for max_fail_percentage 18662 1726867308.38669: done checking for max_fail_percentage 18662 1726867308.38670: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.38671: done checking to see if all hosts have failed 18662 1726867308.38671: getting the remaining hosts for this loop 18662 1726867308.38673: done getting the remaining hosts for this loop 18662 1726867308.38678: getting the next task for host managed_node2 18662 1726867308.38686: done getting next task for host managed_node2 18662 1726867308.38689: ^ task is: TASK: Set flag to indicate system is ostree 18662 1726867308.38691: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867308.38695: getting variables 18662 1726867308.38697: in VariableManager get_vars() 18662 1726867308.38726: Calling all_inventory to load vars for managed_node2 18662 1726867308.38844: Calling groups_inventory to load vars for managed_node2 18662 1726867308.38850: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.38862: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.38864: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.38867: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.39027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.39547: done with get_vars() 18662 1726867308.39558: done getting variables 18662 1726867308.39801: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 17:21:48 -0400 (0:00:00.679) 0:00:03.033 ****** 18662 1726867308.39946: entering _queue_task() for managed_node2/set_fact 18662 1726867308.39948: Creating lock for set_fact 18662 1726867308.40445: worker is 1 (out of 1 available) 18662 1726867308.40457: exiting _queue_task() for managed_node2/set_fact 18662 1726867308.40468: done queuing things up, now waiting for results queue to drain 18662 1726867308.40469: waiting for pending results... 
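The records that follow run the set_fact task just queued: the conditional "not __network_is_ostree is defined" evaluates True, the registered __ostree_booted_stat result is read, and __network_is_ostree ends up false, matching the stat.exists value above. A hedged sketch of such a task, assuming the fact is taken directly from the stat result (the exact expression in el_repo_setup.yml is not visible in this log):

  # Sketch only: the real expression is not shown in this log.
  - name: Set flag to indicate system is ostree
    ansible.builtin.set_fact:
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __network_is_ostree is defined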
18662 1726867308.40895: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 18662 1726867308.41042: in run() - task 0affcac9-a3a5-efab-a8ce-000000000090 18662 1726867308.41123: variable 'ansible_search_path' from source: unknown 18662 1726867308.41126: variable 'ansible_search_path' from source: unknown 18662 1726867308.41133: calling self._execute() 18662 1726867308.41216: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.41341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.41357: variable 'omit' from source: magic vars 18662 1726867308.42531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867308.42964: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867308.43006: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867308.43106: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867308.43212: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867308.43359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867308.43429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867308.43526: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867308.43594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867308.43763: Evaluated conditional (not __network_is_ostree is defined): True 18662 1726867308.43784: variable 'omit' from source: magic vars 18662 1726867308.43885: variable 'omit' from source: magic vars 18662 1726867308.44005: variable '__ostree_booted_stat' from source: set_fact 18662 1726867308.44064: variable 'omit' from source: magic vars 18662 1726867308.44101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867308.44163: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867308.44166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867308.44175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867308.44192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867308.44231: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867308.44239: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.44247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.44384: Set connection var ansible_timeout to 10 18662 1726867308.44390: 
Set connection var ansible_connection to ssh 18662 1726867308.44392: Set connection var ansible_shell_executable to /bin/sh 18662 1726867308.44394: Set connection var ansible_shell_type to sh 18662 1726867308.44396: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867308.44398: Set connection var ansible_pipelining to False 18662 1726867308.44430: variable 'ansible_shell_executable' from source: unknown 18662 1726867308.44484: variable 'ansible_connection' from source: unknown 18662 1726867308.44488: variable 'ansible_module_compression' from source: unknown 18662 1726867308.44490: variable 'ansible_shell_type' from source: unknown 18662 1726867308.44492: variable 'ansible_shell_executable' from source: unknown 18662 1726867308.44494: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.44496: variable 'ansible_pipelining' from source: unknown 18662 1726867308.44497: variable 'ansible_timeout' from source: unknown 18662 1726867308.44499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.44575: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867308.44597: variable 'omit' from source: magic vars 18662 1726867308.44608: starting attempt loop 18662 1726867308.44616: running the handler 18662 1726867308.44631: handler run complete 18662 1726867308.44701: attempt loop complete, returning result 18662 1726867308.44704: _execute() done 18662 1726867308.44706: dumping result to json 18662 1726867308.44708: done dumping result, returning 18662 1726867308.44710: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0affcac9-a3a5-efab-a8ce-000000000090] 18662 1726867308.44712: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000090 18662 1726867308.44840: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000090 18662 1726867308.44844: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 18662 1726867308.44896: no more pending results, returning what we have 18662 1726867308.44899: results queue empty 18662 1726867308.44900: checking for any_errors_fatal 18662 1726867308.44907: done checking for any_errors_fatal 18662 1726867308.44907: checking for max_fail_percentage 18662 1726867308.44911: done checking for max_fail_percentage 18662 1726867308.44912: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.44913: done checking to see if all hosts have failed 18662 1726867308.44913: getting the remaining hosts for this loop 18662 1726867308.44915: done getting the remaining hosts for this loop 18662 1726867308.44918: getting the next task for host managed_node2 18662 1726867308.44926: done getting next task for host managed_node2 18662 1726867308.45078: ^ task is: TASK: Fix CentOS6 Base repo 18662 1726867308.45083: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867308.45087: getting variables 18662 1726867308.45088: in VariableManager get_vars() 18662 1726867308.45117: Calling all_inventory to load vars for managed_node2 18662 1726867308.45120: Calling groups_inventory to load vars for managed_node2 18662 1726867308.45123: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.45133: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.45136: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.45144: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.45892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.46335: done with get_vars() 18662 1726867308.46345: done getting variables 18662 1726867308.46689: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 17:21:48 -0400 (0:00:00.068) 0:00:03.102 ****** 18662 1726867308.46719: entering _queue_task() for managed_node2/copy 18662 1726867308.47324: worker is 1 (out of 1 available) 18662 1726867308.47389: exiting _queue_task() for managed_node2/copy 18662 1726867308.47399: done queuing things up, now waiting for results queue to drain 18662 1726867308.47401: waiting for pending results... 
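The next records run the "Fix CentOS6 Base repo" copy task, which is gated on the host being CentOS 6: ansible_distribution == 'CentOS' evaluates True, but the major-version check evaluates False (the Create EPEL task title later renders the version as 10), so the task is skipped. A structural sketch of that gating, with a hypothetical destination because the skipped task never prints its arguments:

  # Sketch only: dest and content are hypothetical placeholders, not taken from this log.
  - name: Fix CentOS6 Base repo
    ansible.builtin.copy:
      dest: /etc/yum.repos.d/CentOS-Base.repo   # hypothetical path
      content: "..."                            # repo contents not recoverable from this log
    when:
      - ansible_distribution == 'CentOS'
      - ansible_distribution_major_version == '6'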
18662 1726867308.47732: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 18662 1726867308.47910: in run() - task 0affcac9-a3a5-efab-a8ce-000000000092 18662 1726867308.47941: variable 'ansible_search_path' from source: unknown 18662 1726867308.48069: variable 'ansible_search_path' from source: unknown 18662 1726867308.48386: calling self._execute() 18662 1726867308.48390: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.48392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.48394: variable 'omit' from source: magic vars 18662 1726867308.49782: variable 'ansible_distribution' from source: facts 18662 1726867308.49810: Evaluated conditional (ansible_distribution == 'CentOS'): True 18662 1726867308.49936: variable 'ansible_distribution_major_version' from source: facts 18662 1726867308.49947: Evaluated conditional (ansible_distribution_major_version == '6'): False 18662 1726867308.49954: when evaluation is False, skipping this task 18662 1726867308.49960: _execute() done 18662 1726867308.49972: dumping result to json 18662 1726867308.49982: done dumping result, returning 18662 1726867308.49992: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0affcac9-a3a5-efab-a8ce-000000000092] 18662 1726867308.50001: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000092 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18662 1726867308.50171: no more pending results, returning what we have 18662 1726867308.50174: results queue empty 18662 1726867308.50175: checking for any_errors_fatal 18662 1726867308.50186: done checking for any_errors_fatal 18662 1726867308.50187: checking for max_fail_percentage 18662 1726867308.50188: done checking for max_fail_percentage 18662 1726867308.50189: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.50190: done checking to see if all hosts have failed 18662 1726867308.50190: getting the remaining hosts for this loop 18662 1726867308.50192: done getting the remaining hosts for this loop 18662 1726867308.50195: getting the next task for host managed_node2 18662 1726867308.50202: done getting next task for host managed_node2 18662 1726867308.50205: ^ task is: TASK: Include the task 'enable_epel.yml' 18662 1726867308.50211: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867308.50215: getting variables 18662 1726867308.50217: in VariableManager get_vars() 18662 1726867308.50244: Calling all_inventory to load vars for managed_node2 18662 1726867308.50247: Calling groups_inventory to load vars for managed_node2 18662 1726867308.50250: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.50264: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.50267: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.50270: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.50335: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000092 18662 1726867308.50338: WORKER PROCESS EXITING 18662 1726867308.50575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.50995: done with get_vars() 18662 1726867308.51010: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 17:21:48 -0400 (0:00:00.043) 0:00:03.146 ****** 18662 1726867308.51106: entering _queue_task() for managed_node2/include_tasks 18662 1726867308.51542: worker is 1 (out of 1 available) 18662 1726867308.51805: exiting _queue_task() for managed_node2/include_tasks 18662 1726867308.51817: done queuing things up, now waiting for results queue to drain 18662 1726867308.51818: waiting for pending results... 18662 1726867308.52107: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 18662 1726867308.52170: in run() - task 0affcac9-a3a5-efab-a8ce-000000000093 18662 1726867308.52192: variable 'ansible_search_path' from source: unknown 18662 1726867308.52263: variable 'ansible_search_path' from source: unknown 18662 1726867308.52267: calling self._execute() 18662 1726867308.52327: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.52338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.52350: variable 'omit' from source: magic vars 18662 1726867308.52986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867308.55927: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867308.56002: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867308.56061: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867308.56127: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867308.56192: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867308.56375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867308.56658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867308.56661: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867308.56664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867308.56666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867308.56743: variable '__network_is_ostree' from source: set_fact 18662 1726867308.56773: Evaluated conditional (not __network_is_ostree | d(false)): True 18662 1726867308.56787: _execute() done 18662 1726867308.56796: dumping result to json 18662 1726867308.56805: done dumping result, returning 18662 1726867308.56835: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0affcac9-a3a5-efab-a8ce-000000000093] 18662 1726867308.56846: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000093 18662 1726867308.56969: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000093 18662 1726867308.57004: no more pending results, returning what we have 18662 1726867308.57012: in VariableManager get_vars() 18662 1726867308.57047: Calling all_inventory to load vars for managed_node2 18662 1726867308.57049: Calling groups_inventory to load vars for managed_node2 18662 1726867308.57053: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.57063: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.57066: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.57069: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.57426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.57828: done with get_vars() 18662 1726867308.57841: variable 'ansible_search_path' from source: unknown 18662 1726867308.57843: variable 'ansible_search_path' from source: unknown 18662 1726867308.57857: WORKER PROCESS EXITING 18662 1726867308.57885: we have included files to process 18662 1726867308.57887: generating all_blocks data 18662 1726867308.57888: done generating all_blocks data 18662 1726867308.57926: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18662 1726867308.57928: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18662 1726867308.57932: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 18662 1726867308.58693: done processing included file 18662 1726867308.58695: iterating over new_blocks loaded from include file 18662 1726867308.58697: in VariableManager get_vars() 18662 1726867308.58714: done with get_vars() 18662 1726867308.58715: filtering new block on tags 18662 1726867308.58737: done filtering new block on tags 18662 1726867308.58740: in VariableManager get_vars() 18662 1726867308.58749: done with get_vars() 18662 1726867308.58750: filtering new block on tags 18662 1726867308.58759: done filtering new block on tags 18662 1726867308.58761: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 18662 1726867308.58766: extending task lists for all hosts with included blocks 18662 1726867308.58862: done extending task lists 18662 1726867308.58863: done processing included files 18662 1726867308.58864: results queue empty 18662 1726867308.58864: checking for any_errors_fatal 18662 1726867308.58868: done checking for any_errors_fatal 18662 1726867308.58869: checking for max_fail_percentage 18662 1726867308.58870: done checking for max_fail_percentage 18662 1726867308.58870: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.58871: done checking to see if all hosts have failed 18662 1726867308.58872: getting the remaining hosts for this loop 18662 1726867308.58873: done getting the remaining hosts for this loop 18662 1726867308.58874: getting the next task for host managed_node2 18662 1726867308.58879: done getting next task for host managed_node2 18662 1726867308.58881: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 18662 1726867308.58884: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867308.58886: getting variables 18662 1726867308.58886: in VariableManager get_vars() 18662 1726867308.58893: Calling all_inventory to load vars for managed_node2 18662 1726867308.58895: Calling groups_inventory to load vars for managed_node2 18662 1726867308.58897: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.58902: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.58910: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.58913: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.59058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.59253: done with get_vars() 18662 1726867308.59262: done getting variables 18662 1726867308.59330: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 18662 1726867308.59531: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 17:21:48 -0400 (0:00:00.084) 0:00:03.231 ****** 18662 1726867308.59580: entering _queue_task() for managed_node2/command 18662 1726867308.59583: Creating lock for command 18662 1726867308.59848: worker is 1 (out of 1 available) 18662 1726867308.59860: exiting _queue_task() for managed_node2/command 18662 1726867308.59871: done queuing things up, now waiting for results queue to drain 18662 1726867308.59873: waiting for pending results... 
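"Create EPEL {{ ansible_distribution_major_version }}" renders here as "Create EPEL 10". The records below show it is a command task that only runs when the distribution is RedHat or CentOS and the major version is 7 or 8, so it is skipped on this host. A sketch of that gating, with the command left as a labeled placeholder because the skipped task never prints it:

  # Sketch only: the actual command in enable_epel.yml is not visible in this log.
  - name: Create EPEL {{ ansible_distribution_major_version }}
    ansible.builtin.command: /bin/true   # placeholder; the real EPEL setup command is not shown in the log
    when:
      - ansible_distribution in ['RedHat', 'CentOS']
      - ansible_distribution_major_version in ['7', '8']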
18662 1726867308.60181: running TaskExecutor() for managed_node2/TASK: Create EPEL 10 18662 1726867308.60676: in run() - task 0affcac9-a3a5-efab-a8ce-0000000000ad 18662 1726867308.60682: variable 'ansible_search_path' from source: unknown 18662 1726867308.60685: variable 'ansible_search_path' from source: unknown 18662 1726867308.60688: calling self._execute() 18662 1726867308.60801: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.60817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.60902: variable 'omit' from source: magic vars 18662 1726867308.61574: variable 'ansible_distribution' from source: facts 18662 1726867308.61884: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18662 1726867308.61927: variable 'ansible_distribution_major_version' from source: facts 18662 1726867308.61939: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18662 1726867308.61947: when evaluation is False, skipping this task 18662 1726867308.61954: _execute() done 18662 1726867308.61962: dumping result to json 18662 1726867308.61969: done dumping result, returning 18662 1726867308.61983: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [0affcac9-a3a5-efab-a8ce-0000000000ad] 18662 1726867308.61997: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000ad skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18662 1726867308.62159: no more pending results, returning what we have 18662 1726867308.62163: results queue empty 18662 1726867308.62163: checking for any_errors_fatal 18662 1726867308.62165: done checking for any_errors_fatal 18662 1726867308.62166: checking for max_fail_percentage 18662 1726867308.62167: done checking for max_fail_percentage 18662 1726867308.62168: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.62168: done checking to see if all hosts have failed 18662 1726867308.62169: getting the remaining hosts for this loop 18662 1726867308.62170: done getting the remaining hosts for this loop 18662 1726867308.62173: getting the next task for host managed_node2 18662 1726867308.62188: done getting next task for host managed_node2 18662 1726867308.62191: ^ task is: TASK: Install yum-utils package 18662 1726867308.62195: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867308.62199: getting variables 18662 1726867308.62200: in VariableManager get_vars() 18662 1726867308.62229: Calling all_inventory to load vars for managed_node2 18662 1726867308.62231: Calling groups_inventory to load vars for managed_node2 18662 1726867308.62234: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.62246: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.62250: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.62253: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.62621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.63127: done with get_vars() 18662 1726867308.63140: done getting variables 18662 1726867308.63172: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000ad 18662 1726867308.63176: WORKER PROCESS EXITING 18662 1726867308.63252: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 17:21:48 -0400 (0:00:00.037) 0:00:03.268 ****** 18662 1726867308.63281: entering _queue_task() for managed_node2/package 18662 1726867308.63283: Creating lock for package 18662 1726867308.63513: worker is 1 (out of 1 available) 18662 1726867308.63524: exiting _queue_task() for managed_node2/package 18662 1726867308.63535: done queuing things up, now waiting for results queue to drain 18662 1726867308.63537: waiting for pending results... 
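Annotation: the "Install yum-utils package" task (enable_epel.yml:26) will short-circuit on the same major-version check. Since the log shows the package action plugin being loaded and the task name carries the package, a plausible shape for the task is sketched below; the module is inferred from the plugin load, and state: present is an assumption.

    # Hedged sketch of the "Install yum-utils package" task (enable_epel.yml:26).
    # The package module is inferred from the 'package' action plugin loaded above;
    # state: present is an assumption.
    - name: Install yum-utils package
      package:
        name: yum-utils
        state: present
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']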
18662 1726867308.63716: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 18662 1726867308.63831: in run() - task 0affcac9-a3a5-efab-a8ce-0000000000ae 18662 1726867308.63848: variable 'ansible_search_path' from source: unknown 18662 1726867308.63856: variable 'ansible_search_path' from source: unknown 18662 1726867308.63896: calling self._execute() 18662 1726867308.64048: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.64064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.64082: variable 'omit' from source: magic vars 18662 1726867308.64429: variable 'ansible_distribution' from source: facts 18662 1726867308.64482: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18662 1726867308.64659: variable 'ansible_distribution_major_version' from source: facts 18662 1726867308.64699: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18662 1726867308.64708: when evaluation is False, skipping this task 18662 1726867308.64716: _execute() done 18662 1726867308.65082: dumping result to json 18662 1726867308.65086: done dumping result, returning 18662 1726867308.65088: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0affcac9-a3a5-efab-a8ce-0000000000ae] 18662 1726867308.65090: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000ae 18662 1726867308.65155: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000ae 18662 1726867308.65159: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18662 1726867308.65252: no more pending results, returning what we have 18662 1726867308.65255: results queue empty 18662 1726867308.65256: checking for any_errors_fatal 18662 1726867308.65259: done checking for any_errors_fatal 18662 1726867308.65259: checking for max_fail_percentage 18662 1726867308.65261: done checking for max_fail_percentage 18662 1726867308.65261: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.65262: done checking to see if all hosts have failed 18662 1726867308.65263: getting the remaining hosts for this loop 18662 1726867308.65264: done getting the remaining hosts for this loop 18662 1726867308.65266: getting the next task for host managed_node2 18662 1726867308.65271: done getting next task for host managed_node2 18662 1726867308.65273: ^ task is: TASK: Enable EPEL 7 18662 1726867308.65276: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867308.65280: getting variables 18662 1726867308.65281: in VariableManager get_vars() 18662 1726867308.65298: Calling all_inventory to load vars for managed_node2 18662 1726867308.65300: Calling groups_inventory to load vars for managed_node2 18662 1726867308.65303: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.65313: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.65316: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.65319: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.65681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.65964: done with get_vars() 18662 1726867308.65972: done getting variables 18662 1726867308.66026: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 17:21:48 -0400 (0:00:00.027) 0:00:03.296 ****** 18662 1726867308.66061: entering _queue_task() for managed_node2/command 18662 1726867308.66392: worker is 1 (out of 1 available) 18662 1726867308.66400: exiting _queue_task() for managed_node2/command 18662 1726867308.66409: done queuing things up, now waiting for results queue to drain 18662 1726867308.66411: waiting for pending results... 18662 1726867308.66714: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 18662 1726867308.66719: in run() - task 0affcac9-a3a5-efab-a8ce-0000000000af 18662 1726867308.66724: variable 'ansible_search_path' from source: unknown 18662 1726867308.66727: variable 'ansible_search_path' from source: unknown 18662 1726867308.66740: calling self._execute() 18662 1726867308.66825: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.66836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.66848: variable 'omit' from source: magic vars 18662 1726867308.67224: variable 'ansible_distribution' from source: facts 18662 1726867308.67251: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18662 1726867308.67469: variable 'ansible_distribution_major_version' from source: facts 18662 1726867308.67472: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18662 1726867308.67475: when evaluation is False, skipping this task 18662 1726867308.67479: _execute() done 18662 1726867308.67481: dumping result to json 18662 1726867308.67483: done dumping result, returning 18662 1726867308.67486: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0affcac9-a3a5-efab-a8ce-0000000000af] 18662 1726867308.67488: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000af 18662 1726867308.67545: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000af 18662 1726867308.67548: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18662 1726867308.67627: no more pending results, returning what we 
have 18662 1726867308.67630: results queue empty 18662 1726867308.67631: checking for any_errors_fatal 18662 1726867308.67638: done checking for any_errors_fatal 18662 1726867308.67638: checking for max_fail_percentage 18662 1726867308.67640: done checking for max_fail_percentage 18662 1726867308.67640: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.67641: done checking to see if all hosts have failed 18662 1726867308.67642: getting the remaining hosts for this loop 18662 1726867308.67643: done getting the remaining hosts for this loop 18662 1726867308.67646: getting the next task for host managed_node2 18662 1726867308.67652: done getting next task for host managed_node2 18662 1726867308.67655: ^ task is: TASK: Enable EPEL 8 18662 1726867308.67659: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867308.67663: getting variables 18662 1726867308.67664: in VariableManager get_vars() 18662 1726867308.67695: Calling all_inventory to load vars for managed_node2 18662 1726867308.67698: Calling groups_inventory to load vars for managed_node2 18662 1726867308.67701: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.67712: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.67715: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.67718: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.68120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.68367: done with get_vars() 18662 1726867308.68376: done getting variables 18662 1726867308.68442: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 17:21:48 -0400 (0:00:00.024) 0:00:03.320 ****** 18662 1726867308.68472: entering _queue_task() for managed_node2/command 18662 1726867308.68713: worker is 1 (out of 1 available) 18662 1726867308.68723: exiting _queue_task() for managed_node2/command 18662 1726867308.68734: done queuing things up, now waiting for results queue to drain 18662 1726867308.68736: waiting for pending results... 
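Annotation: two details are worth noting before the next result comes back. Every task in this included file is gated first on the same distribution check, and the tasks named for EPEL 7 and 8 are skipped on the shared in ['7', '8'] version check, while "Enable EPEL 6" (a copy task, seen further down) is gated on == '6'. That pattern is consistent with the distribution check living on an enclosing block and the version checks living on the individual tasks. A hedged sketch of that shape follows; the module arguments and the epel_enable_command variable are placeholders for illustration only, not values from this run.

    # Hedged reconstruction of the when-inheritance implied by the logged evaluations;
    # epel_enable_command and the copy paths are hypothetical placeholders.
    - block:
        - name: Enable EPEL 8
          command: "{{ epel_enable_command }}"   # placeholder command
          when: ansible_distribution_major_version in ['7', '8']
        - name: Enable EPEL 6
          copy:
            src: files/epel.repo                 # placeholder path, for illustration only
            dest: /tmp/epel.repo                 # placeholder path, for illustration only
          when: ansible_distribution_major_version == '6'
      when: ansible_distribution in ['RedHat', 'CentOS']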
18662 1726867308.69094: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 18662 1726867308.69114: in run() - task 0affcac9-a3a5-efab-a8ce-0000000000b0 18662 1726867308.69132: variable 'ansible_search_path' from source: unknown 18662 1726867308.69140: variable 'ansible_search_path' from source: unknown 18662 1726867308.69182: calling self._execute() 18662 1726867308.69259: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.69271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.69291: variable 'omit' from source: magic vars 18662 1726867308.69691: variable 'ansible_distribution' from source: facts 18662 1726867308.69706: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18662 1726867308.69846: variable 'ansible_distribution_major_version' from source: facts 18662 1726867308.69857: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 18662 1726867308.69864: when evaluation is False, skipping this task 18662 1726867308.69883: _execute() done 18662 1726867308.69886: dumping result to json 18662 1726867308.69888: done dumping result, returning 18662 1726867308.69952: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0affcac9-a3a5-efab-a8ce-0000000000b0] 18662 1726867308.69955: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000b0 18662 1726867308.70015: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000b0 18662 1726867308.70018: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 18662 1726867308.70104: no more pending results, returning what we have 18662 1726867308.70108: results queue empty 18662 1726867308.70109: checking for any_errors_fatal 18662 1726867308.70113: done checking for any_errors_fatal 18662 1726867308.70114: checking for max_fail_percentage 18662 1726867308.70116: done checking for max_fail_percentage 18662 1726867308.70117: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.70117: done checking to see if all hosts have failed 18662 1726867308.70118: getting the remaining hosts for this loop 18662 1726867308.70119: done getting the remaining hosts for this loop 18662 1726867308.70122: getting the next task for host managed_node2 18662 1726867308.70131: done getting next task for host managed_node2 18662 1726867308.70134: ^ task is: TASK: Enable EPEL 6 18662 1726867308.70138: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867308.70140: getting variables 18662 1726867308.70142: in VariableManager get_vars() 18662 1726867308.70309: Calling all_inventory to load vars for managed_node2 18662 1726867308.70312: Calling groups_inventory to load vars for managed_node2 18662 1726867308.70315: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.70324: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.70326: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.70329: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.70500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.70699: done with get_vars() 18662 1726867308.70708: done getting variables 18662 1726867308.70766: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 17:21:48 -0400 (0:00:00.023) 0:00:03.343 ****** 18662 1726867308.70796: entering _queue_task() for managed_node2/copy 18662 1726867308.71608: worker is 1 (out of 1 available) 18662 1726867308.71614: exiting _queue_task() for managed_node2/copy 18662 1726867308.71623: done queuing things up, now waiting for results queue to drain 18662 1726867308.71624: waiting for pending results... 18662 1726867308.71848: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 18662 1726867308.71907: in run() - task 0affcac9-a3a5-efab-a8ce-0000000000b2 18662 1726867308.71973: variable 'ansible_search_path' from source: unknown 18662 1726867308.71984: variable 'ansible_search_path' from source: unknown 18662 1726867308.72051: calling self._execute() 18662 1726867308.72286: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.72290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.72292: variable 'omit' from source: magic vars 18662 1726867308.72852: variable 'ansible_distribution' from source: facts 18662 1726867308.72869: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 18662 1726867308.72996: variable 'ansible_distribution_major_version' from source: facts 18662 1726867308.73006: Evaluated conditional (ansible_distribution_major_version == '6'): False 18662 1726867308.73013: when evaluation is False, skipping this task 18662 1726867308.73019: _execute() done 18662 1726867308.73025: dumping result to json 18662 1726867308.73031: done dumping result, returning 18662 1726867308.73083: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0affcac9-a3a5-efab-a8ce-0000000000b2] 18662 1726867308.73086: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000b2 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 18662 1726867308.73257: no more pending results, returning what we have 18662 1726867308.73260: results queue empty 18662 1726867308.73261: checking for any_errors_fatal 18662 1726867308.73266: done checking for any_errors_fatal 18662 
1726867308.73267: checking for max_fail_percentage 18662 1726867308.73268: done checking for max_fail_percentage 18662 1726867308.73269: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.73270: done checking to see if all hosts have failed 18662 1726867308.73270: getting the remaining hosts for this loop 18662 1726867308.73271: done getting the remaining hosts for this loop 18662 1726867308.73275: getting the next task for host managed_node2 18662 1726867308.73285: done getting next task for host managed_node2 18662 1726867308.73287: ^ task is: TASK: Set network provider to 'nm' 18662 1726867308.73290: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867308.73293: getting variables 18662 1726867308.73295: in VariableManager get_vars() 18662 1726867308.73323: Calling all_inventory to load vars for managed_node2 18662 1726867308.73326: Calling groups_inventory to load vars for managed_node2 18662 1726867308.73329: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.73342: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.73345: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.73348: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.73853: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000b2 18662 1726867308.73856: WORKER PROCESS EXITING 18662 1726867308.73880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.74058: done with get_vars() 18662 1726867308.74065: done getting variables 18662 1726867308.74113: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:13 Friday 20 September 2024 17:21:48 -0400 (0:00:00.033) 0:00:03.376 ****** 18662 1726867308.74133: entering _queue_task() for managed_node2/set_fact 18662 1726867308.74354: worker is 1 (out of 1 available) 18662 1726867308.74484: exiting _queue_task() for managed_node2/set_fact 18662 1726867308.74495: done queuing things up, now waiting for results queue to drain 18662 1726867308.74496: waiting for pending results... 
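Annotation: the set_fact task that was just queued comes from tests_ethernet_nm.yml:13 and, as the ok: result below shows, simply pins network_provider to "nm" for the rest of the run. A minimal sketch of that task, reconstructed from the recorded result:

    # Minimal sketch of the "Set network provider to 'nm'" task
    # (tests_ethernet_nm.yml:13), reconstructed from the ok: result below.
    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm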
18662 1726867308.74655: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 18662 1726867308.74984: in run() - task 0affcac9-a3a5-efab-a8ce-000000000007 18662 1726867308.74988: variable 'ansible_search_path' from source: unknown 18662 1726867308.75146: calling self._execute() 18662 1726867308.75383: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.75386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.75389: variable 'omit' from source: magic vars 18662 1726867308.75762: variable 'omit' from source: magic vars 18662 1726867308.75982: variable 'omit' from source: magic vars 18662 1726867308.75986: variable 'omit' from source: magic vars 18662 1726867308.76264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867308.76312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867308.76416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867308.76440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867308.76470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867308.76680: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867308.76684: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.76687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.76820: Set connection var ansible_timeout to 10 18662 1726867308.76829: Set connection var ansible_connection to ssh 18662 1726867308.76841: Set connection var ansible_shell_executable to /bin/sh 18662 1726867308.76848: Set connection var ansible_shell_type to sh 18662 1726867308.76864: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867308.76873: Set connection var ansible_pipelining to False 18662 1726867308.76922: variable 'ansible_shell_executable' from source: unknown 18662 1726867308.77012: variable 'ansible_connection' from source: unknown 18662 1726867308.77020: variable 'ansible_module_compression' from source: unknown 18662 1726867308.77026: variable 'ansible_shell_type' from source: unknown 18662 1726867308.77032: variable 'ansible_shell_executable' from source: unknown 18662 1726867308.77039: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.77048: variable 'ansible_pipelining' from source: unknown 18662 1726867308.77056: variable 'ansible_timeout' from source: unknown 18662 1726867308.77063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.77359: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867308.77363: variable 'omit' from source: magic vars 18662 1726867308.77365: starting attempt loop 18662 1726867308.77367: running the handler 18662 1726867308.77370: handler run complete 18662 1726867308.77372: attempt loop complete, returning result 18662 1726867308.77374: _execute() done 18662 1726867308.77375: 
dumping result to json 18662 1726867308.77379: done dumping result, returning 18662 1726867308.77452: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0affcac9-a3a5-efab-a8ce-000000000007] 18662 1726867308.77463: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000007 ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 18662 1726867308.77616: no more pending results, returning what we have 18662 1726867308.77620: results queue empty 18662 1726867308.77621: checking for any_errors_fatal 18662 1726867308.77630: done checking for any_errors_fatal 18662 1726867308.77631: checking for max_fail_percentage 18662 1726867308.77633: done checking for max_fail_percentage 18662 1726867308.77634: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.77635: done checking to see if all hosts have failed 18662 1726867308.77635: getting the remaining hosts for this loop 18662 1726867308.77637: done getting the remaining hosts for this loop 18662 1726867308.77641: getting the next task for host managed_node2 18662 1726867308.77649: done getting next task for host managed_node2 18662 1726867308.77656: ^ task is: TASK: meta (flush_handlers) 18662 1726867308.77658: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867308.77663: getting variables 18662 1726867308.77665: in VariableManager get_vars() 18662 1726867308.77698: Calling all_inventory to load vars for managed_node2 18662 1726867308.77700: Calling groups_inventory to load vars for managed_node2 18662 1726867308.77704: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.77715: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.77718: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.77721: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.78108: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000007 18662 1726867308.78111: WORKER PROCESS EXITING 18662 1726867308.78132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.78366: done with get_vars() 18662 1726867308.78376: done getting variables 18662 1726867308.78454: in VariableManager get_vars() 18662 1726867308.78463: Calling all_inventory to load vars for managed_node2 18662 1726867308.78465: Calling groups_inventory to load vars for managed_node2 18662 1726867308.78467: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.78472: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.78474: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.78479: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.78629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.78851: done with get_vars() 18662 1726867308.78866: done queuing things up, now waiting for results queue to drain 18662 1726867308.78868: results queue empty 18662 1726867308.78869: checking for any_errors_fatal 18662 1726867308.78871: done checking for any_errors_fatal 18662 1726867308.78872: checking for 
max_fail_percentage 18662 1726867308.78873: done checking for max_fail_percentage 18662 1726867308.78874: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.78874: done checking to see if all hosts have failed 18662 1726867308.78875: getting the remaining hosts for this loop 18662 1726867308.78876: done getting the remaining hosts for this loop 18662 1726867308.78880: getting the next task for host managed_node2 18662 1726867308.78884: done getting next task for host managed_node2 18662 1726867308.78886: ^ task is: TASK: meta (flush_handlers) 18662 1726867308.78887: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867308.78894: getting variables 18662 1726867308.78895: in VariableManager get_vars() 18662 1726867308.78903: Calling all_inventory to load vars for managed_node2 18662 1726867308.78905: Calling groups_inventory to load vars for managed_node2 18662 1726867308.78907: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.78911: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.78913: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.78916: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.79053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.79415: done with get_vars() 18662 1726867308.79423: done getting variables 18662 1726867308.79463: in VariableManager get_vars() 18662 1726867308.79471: Calling all_inventory to load vars for managed_node2 18662 1726867308.79473: Calling groups_inventory to load vars for managed_node2 18662 1726867308.79475: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.79612: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.79615: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.79618: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.79832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.80229: done with get_vars() 18662 1726867308.80240: done queuing things up, now waiting for results queue to drain 18662 1726867308.80242: results queue empty 18662 1726867308.80242: checking for any_errors_fatal 18662 1726867308.80243: done checking for any_errors_fatal 18662 1726867308.80244: checking for max_fail_percentage 18662 1726867308.80245: done checking for max_fail_percentage 18662 1726867308.80245: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.80246: done checking to see if all hosts have failed 18662 1726867308.80246: getting the remaining hosts for this loop 18662 1726867308.80247: done getting the remaining hosts for this loop 18662 1726867308.80249: getting the next task for host managed_node2 18662 1726867308.80251: done getting next task for host managed_node2 18662 1726867308.80252: ^ task is: None 18662 1726867308.80253: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 18662 1726867308.80254: done queuing things up, now waiting for results queue to drain 18662 1726867308.80255: results queue empty 18662 1726867308.80255: checking for any_errors_fatal 18662 1726867308.80354: done checking for any_errors_fatal 18662 1726867308.80355: checking for max_fail_percentage 18662 1726867308.80356: done checking for max_fail_percentage 18662 1726867308.80357: checking to see if all hosts have failed and the running result is not ok 18662 1726867308.80358: done checking to see if all hosts have failed 18662 1726867308.80360: getting the next task for host managed_node2 18662 1726867308.80362: done getting next task for host managed_node2 18662 1726867308.80363: ^ task is: None 18662 1726867308.80364: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867308.80426: in VariableManager get_vars() 18662 1726867308.80439: done with get_vars() 18662 1726867308.80444: in VariableManager get_vars() 18662 1726867308.80451: done with get_vars() 18662 1726867308.80455: variable 'omit' from source: magic vars 18662 1726867308.80514: in VariableManager get_vars() 18662 1726867308.80547: done with get_vars() 18662 1726867308.80570: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 18662 1726867308.80785: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18662 1726867308.80817: getting the remaining hosts for this loop 18662 1726867308.80819: done getting the remaining hosts for this loop 18662 1726867308.80821: getting the next task for host managed_node2 18662 1726867308.80824: done getting next task for host managed_node2 18662 1726867308.80826: ^ task is: TASK: Gathering Facts 18662 1726867308.80827: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867308.80829: getting variables 18662 1726867308.80830: in VariableManager get_vars() 18662 1726867308.80838: Calling all_inventory to load vars for managed_node2 18662 1726867308.80840: Calling groups_inventory to load vars for managed_node2 18662 1726867308.80842: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867308.80846: Calling all_plugins_play to load vars for managed_node2 18662 1726867308.80861: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867308.80864: Calling groups_plugins_play to load vars for managed_node2 18662 1726867308.81006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867308.81195: done with get_vars() 18662 1726867308.81202: done getting variables 18662 1726867308.81246: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Friday 20 September 2024 17:21:48 -0400 (0:00:00.071) 0:00:03.448 ****** 18662 1726867308.81270: entering _queue_task() for managed_node2/gather_facts 18662 1726867308.81534: worker is 1 (out of 1 available) 18662 1726867308.81544: exiting _queue_task() for managed_node2/gather_facts 18662 1726867308.81555: done queuing things up, now waiting for results queue to drain 18662 1726867308.81557: waiting for pending results... 
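Annotation: a new play, "Play for showing the network provider" (tests_ethernet.yml:3), has started, so Ansible queues an explicit fact-gathering task before any of the play's own tasks run. The transcript below shows the mechanics: the setup module is shipped over the existing SSH ControlMaster connection, staged in a per-task temp directory under /root/.ansible/tmp, made executable with chmod u+x, and executed with the remote /usr/bin/python3.12. In playbook terms, the play header that triggers this is roughly the sketch below; the hosts pattern is an assumption, since it is not visible in this excerpt.

    # Hedged sketch of the play header that triggers the fact gathering below;
    # the hosts pattern is an assumption.
    - name: Play for showing the network provider
      hosts: all             # assumption: the actual hosts pattern is not shown here
      gather_facts: true     # this is what queues the "Gathering Facts" task below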
18662 1726867308.81796: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18662 1726867308.81883: in run() - task 0affcac9-a3a5-efab-a8ce-0000000000d8 18662 1726867308.81915: variable 'ansible_search_path' from source: unknown 18662 1726867308.81956: calling self._execute() 18662 1726867308.82042: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.82055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.82071: variable 'omit' from source: magic vars 18662 1726867308.82516: variable 'ansible_distribution_major_version' from source: facts 18662 1726867308.82533: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867308.82582: variable 'omit' from source: magic vars 18662 1726867308.82586: variable 'omit' from source: magic vars 18662 1726867308.82625: variable 'omit' from source: magic vars 18662 1726867308.82701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867308.82716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867308.82741: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867308.82770: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867308.82789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867308.82826: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867308.82876: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.82881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.82955: Set connection var ansible_timeout to 10 18662 1726867308.82963: Set connection var ansible_connection to ssh 18662 1726867308.82973: Set connection var ansible_shell_executable to /bin/sh 18662 1726867308.82989: Set connection var ansible_shell_type to sh 18662 1726867308.83004: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867308.83012: Set connection var ansible_pipelining to False 18662 1726867308.83084: variable 'ansible_shell_executable' from source: unknown 18662 1726867308.83094: variable 'ansible_connection' from source: unknown 18662 1726867308.83097: variable 'ansible_module_compression' from source: unknown 18662 1726867308.83100: variable 'ansible_shell_type' from source: unknown 18662 1726867308.83102: variable 'ansible_shell_executable' from source: unknown 18662 1726867308.83104: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867308.83106: variable 'ansible_pipelining' from source: unknown 18662 1726867308.83108: variable 'ansible_timeout' from source: unknown 18662 1726867308.83109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867308.83276: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867308.83294: variable 'omit' from source: magic vars 18662 1726867308.83312: starting attempt loop 18662 1726867308.83422: running the 
handler 18662 1726867308.83426: variable 'ansible_facts' from source: unknown 18662 1726867308.83428: _low_level_execute_command(): starting 18662 1726867308.83430: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867308.84112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867308.84193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867308.84216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867308.84235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867308.84249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867308.84324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18662 1726867308.87088: stdout chunk (state=3): >>>/root <<< 18662 1726867308.87091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867308.87093: stdout chunk (state=3): >>><<< 18662 1726867308.87096: stderr chunk (state=3): >>><<< 18662 1726867308.87098: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18662 1726867308.87101: _low_level_execute_command(): starting 18662 1726867308.87104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742 `" && echo 
ansible-tmp-1726867308.8699617-18814-11464055502742="` echo /root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742 `" ) && sleep 0' 18662 1726867308.88126: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867308.88159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867308.88404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867308.88456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867308.90444: stdout chunk (state=3): >>>ansible-tmp-1726867308.8699617-18814-11464055502742=/root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742 <<< 18662 1726867308.90547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867308.90598: stderr chunk (state=3): >>><<< 18662 1726867308.90601: stdout chunk (state=3): >>><<< 18662 1726867308.90650: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867308.8699617-18814-11464055502742=/root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867308.90738: variable 'ansible_module_compression' from source: unknown 18662 1726867308.90816: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18662 1726867308.91159: variable 'ansible_facts' from 
source: unknown 18662 1726867308.91357: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742/AnsiballZ_setup.py 18662 1726867308.91897: Sending initial data 18662 1726867308.91907: Sent initial data (153 bytes) 18662 1726867308.92829: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867308.92842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867308.92854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867308.93195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867308.93198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867308.93227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867308.93310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867308.95012: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867308.95111: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867308.95128: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpne39ek7v /root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742/AnsiballZ_setup.py <<< 18662 1726867308.95138: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742/AnsiballZ_setup.py" <<< 18662 1726867308.95201: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpne39ek7v" to remote "/root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742/AnsiballZ_setup.py" <<< 18662 1726867308.97716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867308.97984: stderr chunk (state=3): >>><<< 18662 1726867308.97988: stdout chunk (state=3): >>><<< 18662 1726867308.97990: done transferring module to remote 18662 1726867308.97992: _low_level_execute_command(): starting 18662 1726867308.97995: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742/ /root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742/AnsiballZ_setup.py && sleep 0' 18662 1726867308.99274: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867308.99279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867308.99283: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867308.99285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867308.99287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 18662 1726867308.99290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867308.99364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867308.99427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867308.99439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867308.99544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867309.01585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867309.01590: stdout chunk (state=3): >>><<< 18662 1726867309.01592: stderr chunk (state=3): >>><<< 18662 1726867309.01597: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867309.01600: _low_level_execute_command(): starting 18662 1726867309.01602: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742/AnsiballZ_setup.py && sleep 0' 18662 1726867309.02351: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867309.02499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867309.02513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867309.02527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867309.02543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867309.02633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867309.80092: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.4140625, "5m": 0.38427734375, "15m": 0.20068359375}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2955, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 576, "free": 2955}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 547, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794484224, "block_size": 4096, "block_total": 65519099, "block_available": 63914669, "block_used": 1604430, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "49", "epoch": "1726867309", "epoch_int": "1726867309", "date": "2024-09-20", "time": "17:21:49", "iso8601_micro": "2024-09-20T21:21:49.741628Z", "iso8601": "2024-09-20T21:21:49Z", "iso8601_basic": "20240920T172149741628", "iso8601_basic_short": "20240920T172149", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], 
"executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "<<< 18662 1726867309.80126: stdout chunk (state=3): >>>off [fixed]", "tx_checksum_sctp": "on [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18662 1726867309.82805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867309.82833: stdout chunk (state=3): >>><<< 18662 1726867309.82849: stderr chunk (state=3): >>><<< 18662 1726867309.82895: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.4140625, "5m": 0.38427734375, "15m": 0.20068359375}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2955, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 576, "free": 2955}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 547, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, 
"size_available": 261794484224, "block_size": 4096, "block_total": 65519099, "block_available": 63914669, "block_used": 1604430, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "49", "epoch": "1726867309", "epoch_int": "1726867309", "date": "2024-09-20", "time": "17:21:49", "iso8601_micro": "2024-09-20T21:21:49.741628Z", "iso8601": "2024-09-20T21:21:49Z", "iso8601_basic": "20240920T172149741628", "iso8601_basic_short": "20240920T172149", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off 
[fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": 
["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867309.83351: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867309.83375: _low_level_execute_command(): starting 18662 1726867309.83400: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867308.8699617-18814-11464055502742/ > /dev/null 2>&1 && sleep 0' 18662 1726867309.84061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867309.84101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867309.84121: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867309.84170: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867309.84174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867309.84253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867309.86892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867309.86916: stderr chunk (state=3): >>><<< 18662 1726867309.86983: stdout chunk (state=3): >>><<< 18662 1726867309.86987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867309.86992: handler run complete 18662 1726867309.87105: variable 'ansible_facts' from source: unknown 18662 1726867309.87288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867309.87489: variable 'ansible_facts' from source: unknown 18662 1726867309.87555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867309.87638: attempt loop complete, returning result 18662 1726867309.87641: _execute() done 18662 1726867309.87643: dumping result to json 18662 1726867309.87668: done dumping result, returning 18662 1726867309.87675: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-efab-a8ce-0000000000d8] 18662 1726867309.87682: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000d8 18662 1726867309.87973: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000d8 18662 1726867309.87976: WORKER PROCESS EXITING ok: [managed_node2] 18662 1726867309.88164: no more pending results, returning what we have 18662 1726867309.88166: results queue empty 18662 1726867309.88167: checking for any_errors_fatal 18662 1726867309.88168: done checking for any_errors_fatal 18662 1726867309.88168: checking for max_fail_percentage 18662 1726867309.88169: done checking for max_fail_percentage 18662 1726867309.88169: checking to see if all hosts have failed and the running result is not ok 18662 1726867309.88170: done checking to see if all hosts have failed 18662 1726867309.88170: getting the remaining hosts for this loop 18662 1726867309.88171: done getting the remaining hosts for this loop 18662 1726867309.88174: getting the next task for host managed_node2 18662 1726867309.88188: done 
getting next task for host managed_node2 18662 1726867309.88190: ^ task is: TASK: meta (flush_handlers) 18662 1726867309.88192: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867309.88196: getting variables 18662 1726867309.88197: in VariableManager get_vars() 18662 1726867309.88221: Calling all_inventory to load vars for managed_node2 18662 1726867309.88224: Calling groups_inventory to load vars for managed_node2 18662 1726867309.88227: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867309.88237: Calling all_plugins_play to load vars for managed_node2 18662 1726867309.88239: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867309.88242: Calling groups_plugins_play to load vars for managed_node2 18662 1726867309.88430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867309.88643: done with get_vars() 18662 1726867309.88653: done getting variables 18662 1726867309.88723: in VariableManager get_vars() 18662 1726867309.88732: Calling all_inventory to load vars for managed_node2 18662 1726867309.88734: Calling groups_inventory to load vars for managed_node2 18662 1726867309.88736: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867309.88749: Calling all_plugins_play to load vars for managed_node2 18662 1726867309.88752: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867309.88755: Calling groups_plugins_play to load vars for managed_node2 18662 1726867309.88904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867309.89142: done with get_vars() 18662 1726867309.89150: done queuing things up, now waiting for results queue to drain 18662 1726867309.89151: results queue empty 18662 1726867309.89152: checking for any_errors_fatal 18662 1726867309.89154: done checking for any_errors_fatal 18662 1726867309.89155: checking for max_fail_percentage 18662 1726867309.89155: done checking for max_fail_percentage 18662 1726867309.89156: checking to see if all hosts have failed and the running result is not ok 18662 1726867309.89160: done checking to see if all hosts have failed 18662 1726867309.89160: getting the remaining hosts for this loop 18662 1726867309.89161: done getting the remaining hosts for this loop 18662 1726867309.89162: getting the next task for host managed_node2 18662 1726867309.89165: done getting next task for host managed_node2 18662 1726867309.89166: ^ task is: TASK: Show inside ethernet tests 18662 1726867309.89167: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867309.89168: getting variables 18662 1726867309.89169: in VariableManager get_vars() 18662 1726867309.89174: Calling all_inventory to load vars for managed_node2 18662 1726867309.89175: Calling groups_inventory to load vars for managed_node2 18662 1726867309.89178: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867309.89182: Calling all_plugins_play to load vars for managed_node2 18662 1726867309.89184: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867309.89185: Calling groups_plugins_play to load vars for managed_node2 18662 1726867309.89266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867309.89398: done with get_vars() 18662 1726867309.89403: done getting variables 18662 1726867309.89459: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Friday 20 September 2024 17:21:49 -0400 (0:00:01.082) 0:00:04.530 ****** 18662 1726867309.89479: entering _queue_task() for managed_node2/debug 18662 1726867309.89481: Creating lock for debug 18662 1726867309.89685: worker is 1 (out of 1 available) 18662 1726867309.89697: exiting _queue_task() for managed_node2/debug 18662 1726867309.89707: done queuing things up, now waiting for results queue to drain 18662 1726867309.89708: waiting for pending results... 
18662 1726867309.89850: running TaskExecutor() for managed_node2/TASK: Show inside ethernet tests 18662 1726867309.89908: in run() - task 0affcac9-a3a5-efab-a8ce-00000000000b 18662 1726867309.89920: variable 'ansible_search_path' from source: unknown 18662 1726867309.89951: calling self._execute() 18662 1726867309.90008: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867309.90015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867309.90023: variable 'omit' from source: magic vars 18662 1726867309.90300: variable 'ansible_distribution_major_version' from source: facts 18662 1726867309.90309: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867309.90317: variable 'omit' from source: magic vars 18662 1726867309.90336: variable 'omit' from source: magic vars 18662 1726867309.90359: variable 'omit' from source: magic vars 18662 1726867309.90393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867309.90422: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867309.90439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867309.90451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867309.90461: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867309.90487: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867309.90490: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867309.90493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867309.90558: Set connection var ansible_timeout to 10 18662 1726867309.90562: Set connection var ansible_connection to ssh 18662 1726867309.90564: Set connection var ansible_shell_executable to /bin/sh 18662 1726867309.90567: Set connection var ansible_shell_type to sh 18662 1726867309.90575: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867309.90583: Set connection var ansible_pipelining to False 18662 1726867309.90603: variable 'ansible_shell_executable' from source: unknown 18662 1726867309.90606: variable 'ansible_connection' from source: unknown 18662 1726867309.90609: variable 'ansible_module_compression' from source: unknown 18662 1726867309.90612: variable 'ansible_shell_type' from source: unknown 18662 1726867309.90614: variable 'ansible_shell_executable' from source: unknown 18662 1726867309.90617: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867309.90620: variable 'ansible_pipelining' from source: unknown 18662 1726867309.90623: variable 'ansible_timeout' from source: unknown 18662 1726867309.90628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867309.90738: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867309.90746: variable 'omit' from source: magic vars 18662 1726867309.90751: starting attempt loop 18662 1726867309.90754: running the 
handler 18662 1726867309.90790: handler run complete 18662 1726867309.90813: attempt loop complete, returning result 18662 1726867309.90817: _execute() done 18662 1726867309.90819: dumping result to json 18662 1726867309.90821: done dumping result, returning 18662 1726867309.90824: done running TaskExecutor() for managed_node2/TASK: Show inside ethernet tests [0affcac9-a3a5-efab-a8ce-00000000000b] 18662 1726867309.90828: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000000b 18662 1726867309.90913: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000000b 18662 1726867309.90915: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Inside ethernet tests 18662 1726867309.90958: no more pending results, returning what we have 18662 1726867309.90961: results queue empty 18662 1726867309.90962: checking for any_errors_fatal 18662 1726867309.90963: done checking for any_errors_fatal 18662 1726867309.90964: checking for max_fail_percentage 18662 1726867309.90966: done checking for max_fail_percentage 18662 1726867309.90967: checking to see if all hosts have failed and the running result is not ok 18662 1726867309.90967: done checking to see if all hosts have failed 18662 1726867309.90968: getting the remaining hosts for this loop 18662 1726867309.90969: done getting the remaining hosts for this loop 18662 1726867309.90972: getting the next task for host managed_node2 18662 1726867309.90979: done getting next task for host managed_node2 18662 1726867309.90981: ^ task is: TASK: Show network_provider 18662 1726867309.90983: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867309.90986: getting variables 18662 1726867309.90987: in VariableManager get_vars() 18662 1726867309.91009: Calling all_inventory to load vars for managed_node2 18662 1726867309.91012: Calling groups_inventory to load vars for managed_node2 18662 1726867309.91014: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867309.91022: Calling all_plugins_play to load vars for managed_node2 18662 1726867309.91025: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867309.91027: Calling groups_plugins_play to load vars for managed_node2 18662 1726867309.91147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867309.91260: done with get_vars() 18662 1726867309.91266: done getting variables 18662 1726867309.91310: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Friday 20 September 2024 17:21:49 -0400 (0:00:00.018) 0:00:04.548 ****** 18662 1726867309.91327: entering _queue_task() for managed_node2/debug 18662 1726867309.91499: worker is 1 (out of 1 available) 18662 1726867309.91513: exiting _queue_task() for managed_node2/debug 18662 1726867309.91523: done queuing things up, now waiting for results queue to drain 18662 1726867309.91524: waiting for pending results... 18662 1726867309.91661: running TaskExecutor() for managed_node2/TASK: Show network_provider 18662 1726867309.91715: in run() - task 0affcac9-a3a5-efab-a8ce-00000000000c 18662 1726867309.91723: variable 'ansible_search_path' from source: unknown 18662 1726867309.91754: calling self._execute() 18662 1726867309.91811: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867309.91814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867309.91822: variable 'omit' from source: magic vars 18662 1726867309.92073: variable 'ansible_distribution_major_version' from source: facts 18662 1726867309.92088: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867309.92091: variable 'omit' from source: magic vars 18662 1726867309.92112: variable 'omit' from source: magic vars 18662 1726867309.92134: variable 'omit' from source: magic vars 18662 1726867309.92162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867309.92199: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867309.92207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867309.92222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867309.92230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867309.92250: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867309.92253: variable 'ansible_host' from source: host vars for 
'managed_node2' 18662 1726867309.92256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867309.92325: Set connection var ansible_timeout to 10 18662 1726867309.92329: Set connection var ansible_connection to ssh 18662 1726867309.92332: Set connection var ansible_shell_executable to /bin/sh 18662 1726867309.92334: Set connection var ansible_shell_type to sh 18662 1726867309.92342: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867309.92347: Set connection var ansible_pipelining to False 18662 1726867309.92365: variable 'ansible_shell_executable' from source: unknown 18662 1726867309.92368: variable 'ansible_connection' from source: unknown 18662 1726867309.92370: variable 'ansible_module_compression' from source: unknown 18662 1726867309.92373: variable 'ansible_shell_type' from source: unknown 18662 1726867309.92375: variable 'ansible_shell_executable' from source: unknown 18662 1726867309.92379: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867309.92381: variable 'ansible_pipelining' from source: unknown 18662 1726867309.92385: variable 'ansible_timeout' from source: unknown 18662 1726867309.92389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867309.92542: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867309.92550: variable 'omit' from source: magic vars 18662 1726867309.92555: starting attempt loop 18662 1726867309.92558: running the handler 18662 1726867309.92594: variable 'network_provider' from source: set_fact 18662 1726867309.92650: variable 'network_provider' from source: set_fact 18662 1726867309.92667: handler run complete 18662 1726867309.92680: attempt loop complete, returning result 18662 1726867309.92683: _execute() done 18662 1726867309.92686: dumping result to json 18662 1726867309.92688: done dumping result, returning 18662 1726867309.92695: done running TaskExecutor() for managed_node2/TASK: Show network_provider [0affcac9-a3a5-efab-a8ce-00000000000c] 18662 1726867309.92698: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000000c 18662 1726867309.92780: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000000c 18662 1726867309.92783: WORKER PROCESS EXITING ok: [managed_node2] => { "network_provider": "nm" } 18662 1726867309.92828: no more pending results, returning what we have 18662 1726867309.92831: results queue empty 18662 1726867309.92832: checking for any_errors_fatal 18662 1726867309.92839: done checking for any_errors_fatal 18662 1726867309.92840: checking for max_fail_percentage 18662 1726867309.92841: done checking for max_fail_percentage 18662 1726867309.92841: checking to see if all hosts have failed and the running result is not ok 18662 1726867309.92842: done checking to see if all hosts have failed 18662 1726867309.92843: getting the remaining hosts for this loop 18662 1726867309.92844: done getting the remaining hosts for this loop 18662 1726867309.92847: getting the next task for host managed_node2 18662 1726867309.92852: done getting next task for host managed_node2 18662 1726867309.92854: ^ task is: TASK: meta (flush_handlers) 18662 1726867309.92856: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867309.92858: getting variables 18662 1726867309.92860: in VariableManager get_vars() 18662 1726867309.92882: Calling all_inventory to load vars for managed_node2 18662 1726867309.92884: Calling groups_inventory to load vars for managed_node2 18662 1726867309.92887: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867309.92896: Calling all_plugins_play to load vars for managed_node2 18662 1726867309.92899: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867309.92901: Calling groups_plugins_play to load vars for managed_node2 18662 1726867309.93039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867309.93146: done with get_vars() 18662 1726867309.93151: done getting variables 18662 1726867309.93193: in VariableManager get_vars() 18662 1726867309.93199: Calling all_inventory to load vars for managed_node2 18662 1726867309.93200: Calling groups_inventory to load vars for managed_node2 18662 1726867309.93202: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867309.93204: Calling all_plugins_play to load vars for managed_node2 18662 1726867309.93206: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867309.93207: Calling groups_plugins_play to load vars for managed_node2 18662 1726867309.93290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867309.93399: done with get_vars() 18662 1726867309.93407: done queuing things up, now waiting for results queue to drain 18662 1726867309.93410: results queue empty 18662 1726867309.93411: checking for any_errors_fatal 18662 1726867309.93412: done checking for any_errors_fatal 18662 1726867309.93413: checking for max_fail_percentage 18662 1726867309.93413: done checking for max_fail_percentage 18662 1726867309.93414: checking to see if all hosts have failed and the running result is not ok 18662 1726867309.93414: done checking to see if all hosts have failed 18662 1726867309.93415: getting the remaining hosts for this loop 18662 1726867309.93415: done getting the remaining hosts for this loop 18662 1726867309.93417: getting the next task for host managed_node2 18662 1726867309.93422: done getting next task for host managed_node2 18662 1726867309.93423: ^ task is: TASK: meta (flush_handlers) 18662 1726867309.93424: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867309.93426: getting variables 18662 1726867309.93426: in VariableManager get_vars() 18662 1726867309.93431: Calling all_inventory to load vars for managed_node2 18662 1726867309.93432: Calling groups_inventory to load vars for managed_node2 18662 1726867309.93433: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867309.93436: Calling all_plugins_play to load vars for managed_node2 18662 1726867309.93438: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867309.93439: Calling groups_plugins_play to load vars for managed_node2 18662 1726867309.93538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867309.93641: done with get_vars() 18662 1726867309.93647: done getting variables 18662 1726867309.93674: in VariableManager get_vars() 18662 1726867309.93682: Calling all_inventory to load vars for managed_node2 18662 1726867309.93684: Calling groups_inventory to load vars for managed_node2 18662 1726867309.93686: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867309.93690: Calling all_plugins_play to load vars for managed_node2 18662 1726867309.93691: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867309.93692: Calling groups_plugins_play to load vars for managed_node2 18662 1726867309.93767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867309.93872: done with get_vars() 18662 1726867309.93881: done queuing things up, now waiting for results queue to drain 18662 1726867309.93882: results queue empty 18662 1726867309.93882: checking for any_errors_fatal 18662 1726867309.93883: done checking for any_errors_fatal 18662 1726867309.93884: checking for max_fail_percentage 18662 1726867309.93884: done checking for max_fail_percentage 18662 1726867309.93884: checking to see if all hosts have failed and the running result is not ok 18662 1726867309.93885: done checking to see if all hosts have failed 18662 1726867309.93885: getting the remaining hosts for this loop 18662 1726867309.93886: done getting the remaining hosts for this loop 18662 1726867309.93887: getting the next task for host managed_node2 18662 1726867309.93889: done getting next task for host managed_node2 18662 1726867309.93889: ^ task is: None 18662 1726867309.93890: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867309.93891: done queuing things up, now waiting for results queue to drain 18662 1726867309.93892: results queue empty 18662 1726867309.93892: checking for any_errors_fatal 18662 1726867309.93892: done checking for any_errors_fatal 18662 1726867309.93893: checking for max_fail_percentage 18662 1726867309.93893: done checking for max_fail_percentage 18662 1726867309.93894: checking to see if all hosts have failed and the running result is not ok 18662 1726867309.93894: done checking to see if all hosts have failed 18662 1726867309.93896: getting the next task for host managed_node2 18662 1726867309.93898: done getting next task for host managed_node2 18662 1726867309.93898: ^ task is: None 18662 1726867309.93899: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867309.93930: in VariableManager get_vars() 18662 1726867309.93940: done with get_vars() 18662 1726867309.93943: in VariableManager get_vars() 18662 1726867309.93949: done with get_vars() 18662 1726867309.93952: variable 'omit' from source: magic vars 18662 1726867309.93968: in VariableManager get_vars() 18662 1726867309.93974: done with get_vars() 18662 1726867309.93987: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 18662 1726867309.94100: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18662 1726867309.94124: getting the remaining hosts for this loop 18662 1726867309.94125: done getting the remaining hosts for this loop 18662 1726867309.94127: getting the next task for host managed_node2 18662 1726867309.94129: done getting next task for host managed_node2 18662 1726867309.94130: ^ task is: TASK: Gathering Facts 18662 1726867309.94131: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867309.94132: getting variables 18662 1726867309.94132: in VariableManager get_vars() 18662 1726867309.94138: Calling all_inventory to load vars for managed_node2 18662 1726867309.94139: Calling groups_inventory to load vars for managed_node2 18662 1726867309.94140: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867309.94143: Calling all_plugins_play to load vars for managed_node2 18662 1726867309.94144: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867309.94146: Calling groups_plugins_play to load vars for managed_node2 18662 1726867309.94251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867309.94353: done with get_vars() 18662 1726867309.94358: done getting variables 18662 1726867309.94384: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Friday 20 September 2024 17:21:49 -0400 (0:00:00.030) 0:00:04.579 ****** 18662 1726867309.94399: entering _queue_task() for managed_node2/gather_facts 18662 1726867309.94554: worker is 1 (out of 1 available) 18662 1726867309.94565: exiting _queue_task() for managed_node2/gather_facts 18662 1726867309.94575: done queuing things up, now waiting for results queue to drain 18662 1726867309.94576: waiting for pending results... 
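
The Gathering Facts exchange that follows illustrates Ansible's standard low-level execution flow over the multiplexed SSH connection: probe the remote home directory, create a per-task temporary directory under ~/.ansible/tmp, transfer the AnsiballZ_setup.py payload (over sftp in this run), mark it executable, run it with the remote /usr/bin/python3.12, and finally remove the temporary directory. The sketch below reproduces that sequence by hand; the target (root@10.31.12.116) and the individual commands are taken from the log entries that follow, while the temporary-directory name is an illustrative placeholder, scp stands in for the sftp transfer Ansible actually performs, and the "/bin/sh -c '... && sleep 0'" wrappers seen in the log are omitted for brevity.

    # sketch only: the module-execution steps Ansible logs for the setup module
    ssh root@10.31.12.116 'echo ~'                                                        # discover the remote home directory
    ssh root@10.31.12.116 '( umask 77 && mkdir -p ~/.ansible/tmp/ansible-tmp-EXAMPLE )'   # per-task temp dir (placeholder name)
    scp AnsiballZ_setup.py root@10.31.12.116:.ansible/tmp/ansible-tmp-EXAMPLE/            # payload transfer (Ansible uses sftp here)
    ssh root@10.31.12.116 'chmod u+x ~/.ansible/tmp/ansible-tmp-EXAMPLE/ ~/.ansible/tmp/ansible-tmp-EXAMPLE/AnsiballZ_setup.py'
    ssh root@10.31.12.116 '/usr/bin/python3.12 ~/.ansible/tmp/ansible-tmp-EXAMPLE/AnsiballZ_setup.py'   # prints the ansible_facts JSON
    ssh root@10.31.12.116 'rm -f -r ~/.ansible/tmp/ansible-tmp-EXAMPLE/'                  # cleanup of the temp dir
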
18662 1726867309.94717: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18662 1726867309.94775: in run() - task 0affcac9-a3a5-efab-a8ce-0000000000f0 18662 1726867309.94788: variable 'ansible_search_path' from source: unknown 18662 1726867309.94819: calling self._execute() 18662 1726867309.94871: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867309.94875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867309.94886: variable 'omit' from source: magic vars 18662 1726867309.95143: variable 'ansible_distribution_major_version' from source: facts 18662 1726867309.95152: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867309.95157: variable 'omit' from source: magic vars 18662 1726867309.95174: variable 'omit' from source: magic vars 18662 1726867309.95199: variable 'omit' from source: magic vars 18662 1726867309.95230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867309.95259: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867309.95274: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867309.95288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867309.95297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867309.95322: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867309.95325: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867309.95327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867309.95396: Set connection var ansible_timeout to 10 18662 1726867309.95399: Set connection var ansible_connection to ssh 18662 1726867309.95402: Set connection var ansible_shell_executable to /bin/sh 18662 1726867309.95405: Set connection var ansible_shell_type to sh 18662 1726867309.95415: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867309.95420: Set connection var ansible_pipelining to False 18662 1726867309.95437: variable 'ansible_shell_executable' from source: unknown 18662 1726867309.95440: variable 'ansible_connection' from source: unknown 18662 1726867309.95442: variable 'ansible_module_compression' from source: unknown 18662 1726867309.95445: variable 'ansible_shell_type' from source: unknown 18662 1726867309.95447: variable 'ansible_shell_executable' from source: unknown 18662 1726867309.95451: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867309.95454: variable 'ansible_pipelining' from source: unknown 18662 1726867309.95456: variable 'ansible_timeout' from source: unknown 18662 1726867309.95460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867309.95587: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867309.95590: variable 'omit' from source: magic vars 18662 1726867309.95594: starting attempt loop 18662 1726867309.95596: running the 
handler 18662 1726867309.95610: variable 'ansible_facts' from source: unknown 18662 1726867309.95626: _low_level_execute_command(): starting 18662 1726867309.95633: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867309.96136: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867309.96141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867309.96144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867309.96199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867309.96202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867309.96205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867309.96264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867309.98565: stdout chunk (state=3): >>>/root <<< 18662 1726867309.98712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867309.98744: stderr chunk (state=3): >>><<< 18662 1726867309.98748: stdout chunk (state=3): >>><<< 18662 1726867309.98766: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867309.98779: _low_level_execute_command(): starting 18662 1726867309.98783: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977 `" && echo ansible-tmp-1726867309.9876606-18875-53870163891977="` echo /root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977 `" ) && sleep 0' 18662 1726867309.99217: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867309.99220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867309.99223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867309.99232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867309.99234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867309.99281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867309.99284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867309.99334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867310.02009: stdout chunk (state=3): >>>ansible-tmp-1726867309.9876606-18875-53870163891977=/root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977 <<< 18662 1726867310.02165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867310.02195: stderr chunk (state=3): >>><<< 18662 1726867310.02199: stdout chunk (state=3): >>><<< 18662 1726867310.02219: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867309.9876606-18875-53870163891977=/root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 18662 1726867310.02244: variable 'ansible_module_compression' from source: unknown 18662 1726867310.02285: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18662 1726867310.02336: variable 'ansible_facts' from source: unknown 18662 1726867310.02468: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977/AnsiballZ_setup.py 18662 1726867310.02572: Sending initial data 18662 1726867310.02575: Sent initial data (153 bytes) 18662 1726867310.03029: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867310.03032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867310.03035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867310.03037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 18662 1726867310.03039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867310.03041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867310.03089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867310.03106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867310.03142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867310.05332: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867310.05501: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867310.05652: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpfs6jmhgd /root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977/AnsiballZ_setup.py <<< 18662 1726867310.05657: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977/AnsiballZ_setup.py" <<< 18662 1726867310.05764: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpfs6jmhgd" to remote "/root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977/AnsiballZ_setup.py" <<< 18662 1726867310.07457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867310.07562: stderr chunk (state=3): >>><<< 18662 1726867310.07565: stdout chunk (state=3): >>><<< 18662 1726867310.07568: done transferring module to remote 18662 1726867310.07570: _low_level_execute_command(): starting 18662 1726867310.07573: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977/ /root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977/AnsiballZ_setup.py && sleep 0' 18662 1726867310.07952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867310.07955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867310.07957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18662 1726867310.07959: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867310.07965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867310.08014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867310.08021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867310.08064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867310.11002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867310.11005: stdout chunk (state=3): >>><<< 18662 1726867310.11008: stderr chunk (state=3): >>><<< 18662 1726867310.11010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867310.11012: _low_level_execute_command(): starting 18662 1726867310.11015: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977/AnsiballZ_setup.py && sleep 0' 18662 1726867310.11920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867310.12183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867310.12197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867310.12208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867310.12494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867310.95623: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_fips": false, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/<<< 18662 1726867310.95656: stdout chunk (state=3): >>>python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": 
{"status": "disabled"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "50", "epoch": "1726867310", "epoch_int": "1726867310", "date": "2024-09-20", "time": "17:21:50", "iso8601_micro": "2024-09-20T21:21:50.558453Z", "iso8601": "2024-09-20T21:21:50Z", "iso8601_basic": "20240920T172150558453", "iso8601_basic_short": "20240920T172150", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.4140625, "5m": 0.38427734375, "15m": 0.20068359375}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2934, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 597, "free": 2934}, "nocache": {"free": 3272, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 548, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794603008, "block_size": 4096, "block_total": 65519099, 
"block_available": 63914698, "block_used": 1604401, "inode_total": 131070960, "inode_available": 131029049, "inode_used": 41911, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on 
[fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18662 1726867310.98464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867310.98494: stderr chunk (state=3): >>><<< 18662 1726867310.98498: stdout chunk (state=3): >>><<< 18662 1726867310.98527: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", 
"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "50", "epoch": "1726867310", "epoch_int": "1726867310", "date": "2024-09-20", "time": "17:21:50", "iso8601_micro": "2024-09-20T21:21:50.558453Z", "iso8601": "2024-09-20T21:21:50Z", "iso8601_basic": "20240920T172150558453", "iso8601_basic_short": "20240920T172150", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.4140625, "5m": 0.38427734375, "15m": 0.20068359375}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2934, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 597, "free": 2934}, "nocache": {"free": 3272, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], 
"labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 548, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794603008, "block_size": 4096, "block_total": 65519099, "block_available": 63914698, "block_used": 1604401, "inode_total": 131070960, "inode_available": 131029049, "inode_used": 41911, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "gather_subset": ["all"], "module_setup": true}, 
"invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867310.98749: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867310.98770: _low_level_execute_command(): starting 18662 1726867310.98774: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867309.9876606-18875-53870163891977/ > /dev/null 2>&1 && sleep 0' 18662 1726867310.99222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867310.99225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867310.99229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867310.99231: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867310.99233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 
1726867310.99280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867310.99283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867310.99339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867311.02045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867311.02068: stderr chunk (state=3): >>><<< 18662 1726867311.02071: stdout chunk (state=3): >>><<< 18662 1726867311.02085: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867311.02091: handler run complete 18662 1726867311.02169: variable 'ansible_facts' from source: unknown 18662 1726867311.02235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.02415: variable 'ansible_facts' from source: unknown 18662 1726867311.02470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.02545: attempt loop complete, returning result 18662 1726867311.02548: _execute() done 18662 1726867311.02551: dumping result to json 18662 1726867311.02575: done dumping result, returning 18662 1726867311.02583: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-efab-a8ce-0000000000f0] 18662 1726867311.02588: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000f0 18662 1726867311.02832: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000000f0 18662 1726867311.02835: WORKER PROCESS EXITING ok: [managed_node2] 18662 1726867311.03044: no more pending results, returning what we have 18662 1726867311.03046: results queue empty 18662 1726867311.03047: checking for any_errors_fatal 18662 1726867311.03048: done checking for any_errors_fatal 18662 1726867311.03048: checking for max_fail_percentage 18662 1726867311.03049: done checking for max_fail_percentage 18662 1726867311.03050: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.03050: done checking to see if all hosts have failed 18662 1726867311.03050: getting the remaining hosts for this loop 18662 1726867311.03051: done getting the remaining hosts for this loop 18662 1726867311.03053: 
getting the next task for host managed_node2 18662 1726867311.03057: done getting next task for host managed_node2 18662 1726867311.03058: ^ task is: TASK: meta (flush_handlers) 18662 1726867311.03060: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867311.03062: getting variables 18662 1726867311.03063: in VariableManager get_vars() 18662 1726867311.03081: Calling all_inventory to load vars for managed_node2 18662 1726867311.03083: Calling groups_inventory to load vars for managed_node2 18662 1726867311.03085: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.03094: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.03096: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.03098: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.03192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.03302: done with get_vars() 18662 1726867311.03311: done getting variables 18662 1726867311.03358: in VariableManager get_vars() 18662 1726867311.03365: Calling all_inventory to load vars for managed_node2 18662 1726867311.03366: Calling groups_inventory to load vars for managed_node2 18662 1726867311.03368: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.03370: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.03372: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.03373: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.03465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.03571: done with get_vars() 18662 1726867311.03581: done queuing things up, now waiting for results queue to drain 18662 1726867311.03583: results queue empty 18662 1726867311.03583: checking for any_errors_fatal 18662 1726867311.03585: done checking for any_errors_fatal 18662 1726867311.03585: checking for max_fail_percentage 18662 1726867311.03589: done checking for max_fail_percentage 18662 1726867311.03589: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.03590: done checking to see if all hosts have failed 18662 1726867311.03590: getting the remaining hosts for this loop 18662 1726867311.03591: done getting the remaining hosts for this loop 18662 1726867311.03592: getting the next task for host managed_node2 18662 1726867311.03595: done getting next task for host managed_node2 18662 1726867311.03596: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 18662 1726867311.03597: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867311.03598: getting variables 18662 1726867311.03599: in VariableManager get_vars() 18662 1726867311.03603: Calling all_inventory to load vars for managed_node2 18662 1726867311.03605: Calling groups_inventory to load vars for managed_node2 18662 1726867311.03606: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.03609: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.03611: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.03613: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.03692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.03797: done with get_vars() 18662 1726867311.03802: done getting variables 18662 1726867311.03829: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867311.03929: variable 'type' from source: play vars 18662 1726867311.03933: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Friday 20 September 2024 17:21:51 -0400 (0:00:01.095) 0:00:05.675 ****** 18662 1726867311.03960: entering _queue_task() for managed_node2/set_fact 18662 1726867311.04150: worker is 1 (out of 1 available) 18662 1726867311.04161: exiting _queue_task() for managed_node2/set_fact 18662 1726867311.04172: done queuing things up, now waiting for results queue to drain 18662 1726867311.04173: waiting for pending results... 
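The task being queued here is the set_fact at tests_ethernet.yml:20. Its YAML source is not reproduced in this log, but from the templated task name above and the ansible_facts it returns a few entries below (interface=lsr27, type=veth, both resolved from play vars), it is roughly equivalent to the following sketch; the exact wording in the test playbook may differ:

- name: "Set type={{ type }} and interface={{ interface }}"
  ansible.builtin.set_fact:
    type: "{{ type }}"             # resolves to "veth" from play vars
    interface: "{{ interface }}"   # resolves to "lsr27" from play vars
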
18662 1726867311.04314: running TaskExecutor() for managed_node2/TASK: Set type=veth and interface=lsr27 18662 1726867311.04381: in run() - task 0affcac9-a3a5-efab-a8ce-00000000000f 18662 1726867311.04394: variable 'ansible_search_path' from source: unknown 18662 1726867311.04427: calling self._execute() 18662 1726867311.04481: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.04485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.04496: variable 'omit' from source: magic vars 18662 1726867311.04752: variable 'ansible_distribution_major_version' from source: facts 18662 1726867311.04761: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867311.04766: variable 'omit' from source: magic vars 18662 1726867311.04788: variable 'omit' from source: magic vars 18662 1726867311.04808: variable 'type' from source: play vars 18662 1726867311.04869: variable 'type' from source: play vars 18662 1726867311.04876: variable 'interface' from source: play vars 18662 1726867311.04923: variable 'interface' from source: play vars 18662 1726867311.04936: variable 'omit' from source: magic vars 18662 1726867311.04968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867311.04996: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867311.05014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867311.05028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867311.05037: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867311.05063: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867311.05066: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.05069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.05136: Set connection var ansible_timeout to 10 18662 1726867311.05139: Set connection var ansible_connection to ssh 18662 1726867311.05144: Set connection var ansible_shell_executable to /bin/sh 18662 1726867311.05146: Set connection var ansible_shell_type to sh 18662 1726867311.05157: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867311.05160: Set connection var ansible_pipelining to False 18662 1726867311.05181: variable 'ansible_shell_executable' from source: unknown 18662 1726867311.05184: variable 'ansible_connection' from source: unknown 18662 1726867311.05186: variable 'ansible_module_compression' from source: unknown 18662 1726867311.05189: variable 'ansible_shell_type' from source: unknown 18662 1726867311.05191: variable 'ansible_shell_executable' from source: unknown 18662 1726867311.05193: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.05197: variable 'ansible_pipelining' from source: unknown 18662 1726867311.05200: variable 'ansible_timeout' from source: unknown 18662 1726867311.05204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.05349: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867311.05356: variable 'omit' from source: magic vars 18662 1726867311.05361: starting attempt loop 18662 1726867311.05364: running the handler 18662 1726867311.05375: handler run complete 18662 1726867311.05386: attempt loop complete, returning result 18662 1726867311.05394: _execute() done 18662 1726867311.05397: dumping result to json 18662 1726867311.05399: done dumping result, returning 18662 1726867311.05407: done running TaskExecutor() for managed_node2/TASK: Set type=veth and interface=lsr27 [0affcac9-a3a5-efab-a8ce-00000000000f] 18662 1726867311.05412: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000000f 18662 1726867311.05484: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000000f 18662 1726867311.05487: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "interface": "lsr27", "type": "veth" }, "changed": false } 18662 1726867311.05538: no more pending results, returning what we have 18662 1726867311.05541: results queue empty 18662 1726867311.05542: checking for any_errors_fatal 18662 1726867311.05544: done checking for any_errors_fatal 18662 1726867311.05544: checking for max_fail_percentage 18662 1726867311.05546: done checking for max_fail_percentage 18662 1726867311.05546: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.05547: done checking to see if all hosts have failed 18662 1726867311.05548: getting the remaining hosts for this loop 18662 1726867311.05549: done getting the remaining hosts for this loop 18662 1726867311.05552: getting the next task for host managed_node2 18662 1726867311.05557: done getting next task for host managed_node2 18662 1726867311.05559: ^ task is: TASK: Include the task 'show_interfaces.yml' 18662 1726867311.05561: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867311.05564: getting variables 18662 1726867311.05565: in VariableManager get_vars() 18662 1726867311.05590: Calling all_inventory to load vars for managed_node2 18662 1726867311.05592: Calling groups_inventory to load vars for managed_node2 18662 1726867311.05595: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.05603: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.05605: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.05610: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.05749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.05857: done with get_vars() 18662 1726867311.05863: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Friday 20 September 2024 17:21:51 -0400 (0:00:00.019) 0:00:05.694 ****** 18662 1726867311.05924: entering _queue_task() for managed_node2/include_tasks 18662 1726867311.06154: worker is 1 (out of 1 available) 18662 1726867311.06166: exiting _queue_task() for managed_node2/include_tasks 18662 1726867311.06180: done queuing things up, now waiting for results queue to drain 18662 1726867311.06182: waiting for pending results... 18662 1726867311.06592: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 18662 1726867311.06598: in run() - task 0affcac9-a3a5-efab-a8ce-000000000010 18662 1726867311.06601: variable 'ansible_search_path' from source: unknown 18662 1726867311.06603: calling self._execute() 18662 1726867311.06676: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.06692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.06706: variable 'omit' from source: magic vars 18662 1726867311.07094: variable 'ansible_distribution_major_version' from source: facts 18662 1726867311.07116: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867311.07127: _execute() done 18662 1726867311.07135: dumping result to json 18662 1726867311.07142: done dumping result, returning 18662 1726867311.07158: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-efab-a8ce-000000000010] 18662 1726867311.07167: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000010 18662 1726867311.07483: no more pending results, returning what we have 18662 1726867311.07487: in VariableManager get_vars() 18662 1726867311.07516: Calling all_inventory to load vars for managed_node2 18662 1726867311.07519: Calling groups_inventory to load vars for managed_node2 18662 1726867311.07522: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.07530: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.07533: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.07535: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.07682: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000010 18662 1726867311.07685: WORKER PROCESS EXITING 18662 1726867311.07697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.07860: done with get_vars() 18662 1726867311.07867: 
variable 'ansible_search_path' from source: unknown 18662 1726867311.07881: we have included files to process 18662 1726867311.07882: generating all_blocks data 18662 1726867311.07884: done generating all_blocks data 18662 1726867311.07885: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18662 1726867311.07886: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18662 1726867311.07888: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18662 1726867311.08029: in VariableManager get_vars() 18662 1726867311.08043: done with get_vars() 18662 1726867311.08142: done processing included file 18662 1726867311.08144: iterating over new_blocks loaded from include file 18662 1726867311.08146: in VariableManager get_vars() 18662 1726867311.08156: done with get_vars() 18662 1726867311.08157: filtering new block on tags 18662 1726867311.08172: done filtering new block on tags 18662 1726867311.08175: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 18662 1726867311.08182: extending task lists for all hosts with included blocks 18662 1726867311.08501: done extending task lists 18662 1726867311.08502: done processing included files 18662 1726867311.08503: results queue empty 18662 1726867311.08503: checking for any_errors_fatal 18662 1726867311.08506: done checking for any_errors_fatal 18662 1726867311.08507: checking for max_fail_percentage 18662 1726867311.08510: done checking for max_fail_percentage 18662 1726867311.08511: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.08511: done checking to see if all hosts have failed 18662 1726867311.08512: getting the remaining hosts for this loop 18662 1726867311.08513: done getting the remaining hosts for this loop 18662 1726867311.08516: getting the next task for host managed_node2 18662 1726867311.08519: done getting next task for host managed_node2 18662 1726867311.08521: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 18662 1726867311.08523: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867311.08525: getting variables 18662 1726867311.08526: in VariableManager get_vars() 18662 1726867311.08534: Calling all_inventory to load vars for managed_node2 18662 1726867311.08536: Calling groups_inventory to load vars for managed_node2 18662 1726867311.08538: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.08542: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.08544: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.08547: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.08695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.08871: done with get_vars() 18662 1726867311.08882: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:21:51 -0400 (0:00:00.030) 0:00:05.724 ****** 18662 1726867311.08950: entering _queue_task() for managed_node2/include_tasks 18662 1726867311.09182: worker is 1 (out of 1 available) 18662 1726867311.09193: exiting _queue_task() for managed_node2/include_tasks 18662 1726867311.09204: done queuing things up, now waiting for results queue to drain 18662 1726867311.09205: waiting for pending results... 18662 1726867311.09436: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 18662 1726867311.09536: in run() - task 0affcac9-a3a5-efab-a8ce-000000000104 18662 1726867311.09560: variable 'ansible_search_path' from source: unknown 18662 1726867311.09569: variable 'ansible_search_path' from source: unknown 18662 1726867311.09613: calling self._execute() 18662 1726867311.09698: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.09714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.09731: variable 'omit' from source: magic vars 18662 1726867311.10031: variable 'ansible_distribution_major_version' from source: facts 18662 1726867311.10048: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867311.10051: _execute() done 18662 1726867311.10055: dumping result to json 18662 1726867311.10058: done dumping result, returning 18662 1726867311.10065: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-efab-a8ce-000000000104] 18662 1726867311.10069: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000104 18662 1726867311.10150: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000104 18662 1726867311.10153: WORKER PROCESS EXITING 18662 1726867311.10217: no more pending results, returning what we have 18662 1726867311.10220: in VariableManager get_vars() 18662 1726867311.10245: Calling all_inventory to load vars for managed_node2 18662 1726867311.10247: Calling groups_inventory to load vars for managed_node2 18662 1726867311.10250: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.10258: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.10260: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.10265: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.10371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 18662 1726867311.10506: done with get_vars() 18662 1726867311.10513: variable 'ansible_search_path' from source: unknown 18662 1726867311.10514: variable 'ansible_search_path' from source: unknown 18662 1726867311.10537: we have included files to process 18662 1726867311.10538: generating all_blocks data 18662 1726867311.10539: done generating all_blocks data 18662 1726867311.10540: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18662 1726867311.10541: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18662 1726867311.10542: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18662 1726867311.10746: done processing included file 18662 1726867311.10748: iterating over new_blocks loaded from include file 18662 1726867311.10749: in VariableManager get_vars() 18662 1726867311.10756: done with get_vars() 18662 1726867311.10758: filtering new block on tags 18662 1726867311.10767: done filtering new block on tags 18662 1726867311.10769: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 18662 1726867311.10771: extending task lists for all hosts with included blocks 18662 1726867311.10834: done extending task lists 18662 1726867311.10835: done processing included files 18662 1726867311.10836: results queue empty 18662 1726867311.10836: checking for any_errors_fatal 18662 1726867311.10838: done checking for any_errors_fatal 18662 1726867311.10838: checking for max_fail_percentage 18662 1726867311.10839: done checking for max_fail_percentage 18662 1726867311.10839: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.10840: done checking to see if all hosts have failed 18662 1726867311.10840: getting the remaining hosts for this loop 18662 1726867311.10841: done getting the remaining hosts for this loop 18662 1726867311.10842: getting the next task for host managed_node2 18662 1726867311.10845: done getting next task for host managed_node2 18662 1726867311.10846: ^ task is: TASK: Gather current interface info 18662 1726867311.10847: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867311.10849: getting variables 18662 1726867311.10850: in VariableManager get_vars() 18662 1726867311.10855: Calling all_inventory to load vars for managed_node2 18662 1726867311.10857: Calling groups_inventory to load vars for managed_node2 18662 1726867311.10858: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.10861: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.10863: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.10864: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.10947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.11053: done with get_vars() 18662 1726867311.11059: done getting variables 18662 1726867311.11086: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:21:51 -0400 (0:00:00.021) 0:00:05.746 ****** 18662 1726867311.11104: entering _queue_task() for managed_node2/command 18662 1726867311.11282: worker is 1 (out of 1 available) 18662 1726867311.11292: exiting _queue_task() for managed_node2/command 18662 1726867311.11303: done queuing things up, now waiting for results queue to drain 18662 1726867311.11304: waiting for pending results... 
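The task queued next is the command task at get_current_interfaces.yml:3. Its playbook source is not shown in the log, but the module_args echoed further down (chdir=/sys/class/net, _raw_params='ls -1'), the '_current_interfaces' variable read by the following Set current_interfaces task, and the 'Evaluated conditional (False): False' line after the handler completes suggest a task roughly like this sketch; the register name and the changed_when are inferred, not copied from the playbook:

- name: Gather current interface info
  ansible.builtin.command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces   # inferred from the variable consumed by the next task
  changed_when: false             # would explain why the final task result reports changed: false
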
18662 1726867311.11448: running TaskExecutor() for managed_node2/TASK: Gather current interface info 18662 1726867311.11533: in run() - task 0affcac9-a3a5-efab-a8ce-000000000115 18662 1726867311.11545: variable 'ansible_search_path' from source: unknown 18662 1726867311.11549: variable 'ansible_search_path' from source: unknown 18662 1726867311.11588: calling self._execute() 18662 1726867311.11644: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.11650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.11686: variable 'omit' from source: magic vars 18662 1726867311.12046: variable 'ansible_distribution_major_version' from source: facts 18662 1726867311.12083: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867311.12088: variable 'omit' from source: magic vars 18662 1726867311.12282: variable 'omit' from source: magic vars 18662 1726867311.12285: variable 'omit' from source: magic vars 18662 1726867311.12288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867311.12290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867311.12292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867311.12294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867311.12296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867311.12351: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867311.12360: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.12368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.12482: Set connection var ansible_timeout to 10 18662 1726867311.12491: Set connection var ansible_connection to ssh 18662 1726867311.12501: Set connection var ansible_shell_executable to /bin/sh 18662 1726867311.12507: Set connection var ansible_shell_type to sh 18662 1726867311.12638: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867311.12642: Set connection var ansible_pipelining to False 18662 1726867311.12644: variable 'ansible_shell_executable' from source: unknown 18662 1726867311.12647: variable 'ansible_connection' from source: unknown 18662 1726867311.12649: variable 'ansible_module_compression' from source: unknown 18662 1726867311.12651: variable 'ansible_shell_type' from source: unknown 18662 1726867311.12653: variable 'ansible_shell_executable' from source: unknown 18662 1726867311.12655: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.12657: variable 'ansible_pipelining' from source: unknown 18662 1726867311.12659: variable 'ansible_timeout' from source: unknown 18662 1726867311.12661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.12784: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867311.12803: variable 'omit' from source: magic vars 18662 
1726867311.12806: starting attempt loop 18662 1726867311.12812: running the handler 18662 1726867311.12824: _low_level_execute_command(): starting 18662 1726867311.12831: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867311.13362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867311.13372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867311.13388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867311.13437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867311.13440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867311.13485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 18662 1726867311.15625: stdout chunk (state=3): >>>/root <<< 18662 1726867311.15796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867311.15822: stderr chunk (state=3): >>><<< 18662 1726867311.15834: stdout chunk (state=3): >>><<< 18662 1726867311.15980: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 18662 1726867311.15984: _low_level_execute_command(): starting 18662 1726867311.15987: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533 `" && echo 
ansible-tmp-1726867311.1587372-18925-161223481257533="` echo /root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533 `" ) && sleep 0' 18662 1726867311.16621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867311.16680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867311.16703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867311.16725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867311.16826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867311.18762: stdout chunk (state=3): >>>ansible-tmp-1726867311.1587372-18925-161223481257533=/root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533 <<< 18662 1726867311.18895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867311.19090: stderr chunk (state=3): >>><<< 18662 1726867311.19093: stdout chunk (state=3): >>><<< 18662 1726867311.19096: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867311.1587372-18925-161223481257533=/root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867311.19099: variable 'ansible_module_compression' from source: unknown 18662 1726867311.19103: ANSIBALLZ: Using generic lock for ansible.legacy.command 18662 1726867311.19105: ANSIBALLZ: Acquiring lock 18662 
1726867311.19107: ANSIBALLZ: Lock acquired: 140264020905808 18662 1726867311.19109: ANSIBALLZ: Creating module 18662 1726867311.39119: ANSIBALLZ: Writing module into payload 18662 1726867311.39208: ANSIBALLZ: Writing module 18662 1726867311.39232: ANSIBALLZ: Renaming module 18662 1726867311.39238: ANSIBALLZ: Done creating module 18662 1726867311.39254: variable 'ansible_facts' from source: unknown 18662 1726867311.39532: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533/AnsiballZ_command.py 18662 1726867311.39801: Sending initial data 18662 1726867311.39804: Sent initial data (156 bytes) 18662 1726867311.40411: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867311.40424: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867311.40435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867311.40451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867311.40491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867311.40550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867311.40563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867311.40588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867311.40653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867311.42325: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867311.42373: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867311.42414: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpawhx1bqb /root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533/AnsiballZ_command.py <<< 18662 1726867311.42417: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533/AnsiballZ_command.py" <<< 18662 1726867311.42485: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpawhx1bqb" to remote "/root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533/AnsiballZ_command.py" <<< 18662 1726867311.43234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867311.43237: stderr chunk (state=3): >>><<< 18662 1726867311.43239: stdout chunk (state=3): >>><<< 18662 1726867311.43293: done transferring module to remote 18662 1726867311.43312: _low_level_execute_command(): starting 18662 1726867311.43323: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533/ /root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533/AnsiballZ_command.py && sleep 0' 18662 1726867311.43951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867311.43964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867311.43992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867311.44075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867311.44102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867311.44147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867311.44194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867311.46097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867311.46138: stderr chunk (state=3): >>><<< 18662 1726867311.46142: stdout chunk (state=3): >>><<< 18662 1726867311.46145: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867311.46147: _low_level_execute_command(): starting 18662 1726867311.46150: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533/AnsiballZ_command.py && sleep 0' 18662 1726867311.46584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867311.46588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867311.46590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867311.46592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867311.46638: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867311.46645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867311.46693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867311.62563: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:21:51.621117", "end": "2024-09-20 17:21:51.624555", "delta": "0:00:00.003438", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18662 1726867311.64184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867311.64209: stderr chunk (state=3): >>><<< 18662 1726867311.64212: stdout chunk (state=3): >>><<< 18662 1726867311.64229: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:21:51.621117", "end": "2024-09-20 17:21:51.624555", "delta": "0:00:00.003438", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
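The JSON blob above is the raw module result streamed back over the multiplexed SSH connection before the action plugin post-processes it: the module itself reports changed: true, while the task result printed below shows changed: false after a conditional is evaluated against it ('Evaluated conditional (False): False'). Once registered, a play can consume the interface list straight from stdout_lines; a minimal illustrative sketch, assuming the _current_interfaces register name used by the next task:

- name: Show the interfaces reported by the managed node (illustrative only)
  ansible.builtin.debug:
    msg: "{{ _current_interfaces.stdout_lines }}"   # ['bonding_masters', 'eth0', 'lo'] per the stdout above
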
18662 1726867311.64261: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867311.64264: _low_level_execute_command(): starting 18662 1726867311.64269: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867311.1587372-18925-161223481257533/ > /dev/null 2>&1 && sleep 0' 18662 1726867311.64676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867311.64717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867311.64720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867311.64723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 18662 1726867311.64725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867311.64733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867311.64767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867311.64770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867311.64820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867311.66693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867311.66712: stderr chunk (state=3): >>><<< 18662 1726867311.66716: stdout chunk (state=3): >>><<< 18662 1726867311.66726: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867311.66731: handler run complete 18662 1726867311.66751: Evaluated conditional (False): False 18662 1726867311.66760: attempt loop complete, returning result 18662 1726867311.66763: _execute() done 18662 1726867311.66765: dumping result to json 18662 1726867311.66769: done dumping result, returning 18662 1726867311.66778: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affcac9-a3a5-efab-a8ce-000000000115] 18662 1726867311.66783: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000115 18662 1726867311.66874: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000115 18662 1726867311.66876: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003438", "end": "2024-09-20 17:21:51.624555", "rc": 0, "start": "2024-09-20 17:21:51.621117" } STDOUT: bonding_masters eth0 lo 18662 1726867311.66947: no more pending results, returning what we have 18662 1726867311.66951: results queue empty 18662 1726867311.66951: checking for any_errors_fatal 18662 1726867311.66953: done checking for any_errors_fatal 18662 1726867311.66954: checking for max_fail_percentage 18662 1726867311.66955: done checking for max_fail_percentage 18662 1726867311.66955: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.66956: done checking to see if all hosts have failed 18662 1726867311.66957: getting the remaining hosts for this loop 18662 1726867311.66958: done getting the remaining hosts for this loop 18662 1726867311.66961: getting the next task for host managed_node2 18662 1726867311.66967: done getting next task for host managed_node2 18662 1726867311.66969: ^ task is: TASK: Set current_interfaces 18662 1726867311.66973: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867311.66976: getting variables 18662 1726867311.67049: in VariableManager get_vars() 18662 1726867311.67084: Calling all_inventory to load vars for managed_node2 18662 1726867311.67090: Calling groups_inventory to load vars for managed_node2 18662 1726867311.67094: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.67104: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.67106: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.67110: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.67239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.67354: done with get_vars() 18662 1726867311.67361: done getting variables 18662 1726867311.67404: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:21:51 -0400 (0:00:00.563) 0:00:06.309 ****** 18662 1726867311.67427: entering _queue_task() for managed_node2/set_fact 18662 1726867311.67621: worker is 1 (out of 1 available) 18662 1726867311.67632: exiting _queue_task() for managed_node2/set_fact 18662 1726867311.67642: done queuing things up, now waiting for results queue to drain 18662 1726867311.67643: waiting for pending results... 
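For readers following along, the 'Set current_interfaces' task being queued here (get_current_interfaces.yml:9) plausibly amounts to the minimal sketch below. The variable names match what the log reports (_current_interfaces read from the previous step, current_interfaces produced here), but the exact Jinja2 expression used in the collection is an assumption:

    - name: Set current_interfaces
      set_fact:
        # assumed expression; yields ['bonding_masters', 'eth0', 'lo'] per the result logged below
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"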
18662 1726867311.67785: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 18662 1726867311.67847: in run() - task 0affcac9-a3a5-efab-a8ce-000000000116 18662 1726867311.67857: variable 'ansible_search_path' from source: unknown 18662 1726867311.67861: variable 'ansible_search_path' from source: unknown 18662 1726867311.67892: calling self._execute() 18662 1726867311.67953: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.67956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.67966: variable 'omit' from source: magic vars 18662 1726867311.68382: variable 'ansible_distribution_major_version' from source: facts 18662 1726867311.68387: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867311.68390: variable 'omit' from source: magic vars 18662 1726867311.68393: variable 'omit' from source: magic vars 18662 1726867311.68431: variable '_current_interfaces' from source: set_fact 18662 1726867311.68492: variable 'omit' from source: magic vars 18662 1726867311.68533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867311.68568: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867311.68592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867311.68616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867311.68625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867311.68654: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867311.68660: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.68667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.68756: Set connection var ansible_timeout to 10 18662 1726867311.68763: Set connection var ansible_connection to ssh 18662 1726867311.68771: Set connection var ansible_shell_executable to /bin/sh 18662 1726867311.68776: Set connection var ansible_shell_type to sh 18662 1726867311.68791: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867311.68798: Set connection var ansible_pipelining to False 18662 1726867311.68824: variable 'ansible_shell_executable' from source: unknown 18662 1726867311.68832: variable 'ansible_connection' from source: unknown 18662 1726867311.68837: variable 'ansible_module_compression' from source: unknown 18662 1726867311.68842: variable 'ansible_shell_type' from source: unknown 18662 1726867311.68847: variable 'ansible_shell_executable' from source: unknown 18662 1726867311.68852: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.68858: variable 'ansible_pipelining' from source: unknown 18662 1726867311.68863: variable 'ansible_timeout' from source: unknown 18662 1726867311.68869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.68992: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 18662 1726867311.69006: variable 'omit' from source: magic vars 18662 1726867311.69017: starting attempt loop 18662 1726867311.69023: running the handler 18662 1726867311.69035: handler run complete 18662 1726867311.69047: attempt loop complete, returning result 18662 1726867311.69052: _execute() done 18662 1726867311.69058: dumping result to json 18662 1726867311.69064: done dumping result, returning 18662 1726867311.69072: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affcac9-a3a5-efab-a8ce-000000000116] 18662 1726867311.69081: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000116 ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 18662 1726867311.69210: no more pending results, returning what we have 18662 1726867311.69213: results queue empty 18662 1726867311.69214: checking for any_errors_fatal 18662 1726867311.69221: done checking for any_errors_fatal 18662 1726867311.69221: checking for max_fail_percentage 18662 1726867311.69223: done checking for max_fail_percentage 18662 1726867311.69223: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.69224: done checking to see if all hosts have failed 18662 1726867311.69225: getting the remaining hosts for this loop 18662 1726867311.69226: done getting the remaining hosts for this loop 18662 1726867311.69229: getting the next task for host managed_node2 18662 1726867311.69236: done getting next task for host managed_node2 18662 1726867311.69239: ^ task is: TASK: Show current_interfaces 18662 1726867311.69241: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867311.69245: getting variables 18662 1726867311.69246: in VariableManager get_vars() 18662 1726867311.69388: Calling all_inventory to load vars for managed_node2 18662 1726867311.69391: Calling groups_inventory to load vars for managed_node2 18662 1726867311.69393: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.69399: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000116 18662 1726867311.69401: WORKER PROCESS EXITING 18662 1726867311.69408: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.69411: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.69414: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.69572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.69789: done with get_vars() 18662 1726867311.69797: done getting variables 18662 1726867311.69848: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:21:51 -0400 (0:00:00.024) 0:00:06.334 ****** 18662 1726867311.69872: entering _queue_task() for managed_node2/debug 18662 1726867311.70094: worker is 1 (out of 1 available) 18662 1726867311.70104: exiting _queue_task() for managed_node2/debug 18662 1726867311.70186: done queuing things up, now waiting for results queue to drain 18662 1726867311.70188: waiting for pending results... 
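The 'Show current_interfaces' debug task queued here (show_interfaces.yml:5) likely looks like the sketch below; the message template is inferred from the MSG line in its result further down and may be worded differently in the actual file:

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"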
18662 1726867311.70339: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 18662 1726867311.70399: in run() - task 0affcac9-a3a5-efab-a8ce-000000000105 18662 1726867311.70414: variable 'ansible_search_path' from source: unknown 18662 1726867311.70417: variable 'ansible_search_path' from source: unknown 18662 1726867311.70444: calling self._execute() 18662 1726867311.70503: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.70507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.70520: variable 'omit' from source: magic vars 18662 1726867311.70770: variable 'ansible_distribution_major_version' from source: facts 18662 1726867311.70786: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867311.70791: variable 'omit' from source: magic vars 18662 1726867311.70817: variable 'omit' from source: magic vars 18662 1726867311.70884: variable 'current_interfaces' from source: set_fact 18662 1726867311.70904: variable 'omit' from source: magic vars 18662 1726867311.70935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867311.70961: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867311.70979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867311.70993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867311.71001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867311.71024: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867311.71027: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.71030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.71098: Set connection var ansible_timeout to 10 18662 1726867311.71101: Set connection var ansible_connection to ssh 18662 1726867311.71106: Set connection var ansible_shell_executable to /bin/sh 18662 1726867311.71111: Set connection var ansible_shell_type to sh 18662 1726867311.71118: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867311.71122: Set connection var ansible_pipelining to False 18662 1726867311.71140: variable 'ansible_shell_executable' from source: unknown 18662 1726867311.71143: variable 'ansible_connection' from source: unknown 18662 1726867311.71146: variable 'ansible_module_compression' from source: unknown 18662 1726867311.71149: variable 'ansible_shell_type' from source: unknown 18662 1726867311.71151: variable 'ansible_shell_executable' from source: unknown 18662 1726867311.71153: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.71155: variable 'ansible_pipelining' from source: unknown 18662 1726867311.71157: variable 'ansible_timeout' from source: unknown 18662 1726867311.71162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.71259: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
18662 1726867311.71267: variable 'omit' from source: magic vars 18662 1726867311.71272: starting attempt loop 18662 1726867311.71276: running the handler 18662 1726867311.71314: handler run complete 18662 1726867311.71323: attempt loop complete, returning result 18662 1726867311.71326: _execute() done 18662 1726867311.71328: dumping result to json 18662 1726867311.71331: done dumping result, returning 18662 1726867311.71338: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affcac9-a3a5-efab-a8ce-000000000105] 18662 1726867311.71341: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000105 18662 1726867311.71420: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000105 18662 1726867311.71423: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 18662 1726867311.71467: no more pending results, returning what we have 18662 1726867311.71470: results queue empty 18662 1726867311.71471: checking for any_errors_fatal 18662 1726867311.71479: done checking for any_errors_fatal 18662 1726867311.71480: checking for max_fail_percentage 18662 1726867311.71481: done checking for max_fail_percentage 18662 1726867311.71482: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.71482: done checking to see if all hosts have failed 18662 1726867311.71483: getting the remaining hosts for this loop 18662 1726867311.71484: done getting the remaining hosts for this loop 18662 1726867311.71487: getting the next task for host managed_node2 18662 1726867311.71494: done getting next task for host managed_node2 18662 1726867311.71496: ^ task is: TASK: Include the task 'manage_test_interface.yml' 18662 1726867311.71497: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867311.71500: getting variables 18662 1726867311.71502: in VariableManager get_vars() 18662 1726867311.71526: Calling all_inventory to load vars for managed_node2 18662 1726867311.71528: Calling groups_inventory to load vars for managed_node2 18662 1726867311.71532: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.71540: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.71543: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.71545: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.71659: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.71768: done with get_vars() 18662 1726867311.71775: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Friday 20 September 2024 17:21:51 -0400 (0:00:00.019) 0:00:06.353 ****** 18662 1726867311.71838: entering _queue_task() for managed_node2/include_tasks 18662 1726867311.72021: worker is 1 (out of 1 available) 18662 1726867311.72033: exiting _queue_task() for managed_node2/include_tasks 18662 1726867311.72044: done queuing things up, now waiting for results queue to drain 18662 1726867311.72045: waiting for pending results... 
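The include that follows (tests_ethernet.yml:26) pulls in manage_test_interface.yml; a minimal sketch is below. Only the resolved path is visible in the log, so the relative path and any parameters passed with the include (the log later shows 'state' arriving "from source: include params") are assumptions:

    - name: Include the task 'manage_test_interface.yml'
      include_tasks: tasks/manage_test_interface.yml   # assumed relative path; 'state' is passed as an include parameter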
18662 1726867311.72181: running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' 18662 1726867311.72228: in run() - task 0affcac9-a3a5-efab-a8ce-000000000011 18662 1726867311.72239: variable 'ansible_search_path' from source: unknown 18662 1726867311.72268: calling self._execute() 18662 1726867311.72324: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.72327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.72337: variable 'omit' from source: magic vars 18662 1726867311.72581: variable 'ansible_distribution_major_version' from source: facts 18662 1726867311.72589: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867311.72597: _execute() done 18662 1726867311.72600: dumping result to json 18662 1726867311.72604: done dumping result, returning 18662 1726867311.72607: done running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' [0affcac9-a3a5-efab-a8ce-000000000011] 18662 1726867311.72612: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000011 18662 1726867311.72698: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000011 18662 1726867311.72701: WORKER PROCESS EXITING 18662 1726867311.72758: no more pending results, returning what we have 18662 1726867311.72762: in VariableManager get_vars() 18662 1726867311.72790: Calling all_inventory to load vars for managed_node2 18662 1726867311.72792: Calling groups_inventory to load vars for managed_node2 18662 1726867311.72795: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.72803: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.72805: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.72810: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.72954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.73062: done with get_vars() 18662 1726867311.73068: variable 'ansible_search_path' from source: unknown 18662 1726867311.73076: we have included files to process 18662 1726867311.73080: generating all_blocks data 18662 1726867311.73081: done generating all_blocks data 18662 1726867311.73084: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18662 1726867311.73085: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18662 1726867311.73086: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 18662 1726867311.73402: in VariableManager get_vars() 18662 1726867311.73414: done with get_vars() 18662 1726867311.73553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 18662 1726867311.73936: done processing included file 18662 1726867311.73938: iterating over new_blocks loaded from include file 18662 1726867311.73938: in VariableManager get_vars() 18662 1726867311.73945: done with get_vars() 18662 1726867311.73946: filtering new block on tags 18662 1726867311.73964: done filtering new block on tags 18662 1726867311.73965: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 18662 1726867311.73968: extending task lists for all hosts with included blocks 18662 1726867311.74067: done extending task lists 18662 1726867311.74068: done processing included files 18662 1726867311.74068: results queue empty 18662 1726867311.74069: checking for any_errors_fatal 18662 1726867311.74070: done checking for any_errors_fatal 18662 1726867311.74071: checking for max_fail_percentage 18662 1726867311.74072: done checking for max_fail_percentage 18662 1726867311.74072: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.74072: done checking to see if all hosts have failed 18662 1726867311.74073: getting the remaining hosts for this loop 18662 1726867311.74074: done getting the remaining hosts for this loop 18662 1726867311.74075: getting the next task for host managed_node2 18662 1726867311.74079: done getting next task for host managed_node2 18662 1726867311.74080: ^ task is: TASK: Ensure state in ["present", "absent"] 18662 1726867311.74082: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867311.74083: getting variables 18662 1726867311.74084: in VariableManager get_vars() 18662 1726867311.74089: Calling all_inventory to load vars for managed_node2 18662 1726867311.74090: Calling groups_inventory to load vars for managed_node2 18662 1726867311.74092: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.74095: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.74096: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.74098: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.74179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.74285: done with get_vars() 18662 1726867311.74291: done getting variables 18662 1726867311.74335: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 17:21:51 -0400 (0:00:00.025) 0:00:06.379 ****** 18662 1726867311.74354: entering _queue_task() for managed_node2/fail 18662 1726867311.74356: Creating lock for fail 18662 1726867311.74546: worker is 1 (out of 1 available) 18662 1726867311.74558: exiting _queue_task() for managed_node2/fail 18662 1726867311.74568: done queuing things up, now waiting for results queue to drain 18662 1726867311.74570: waiting for pending results... 
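The first task of manage_test_interface.yml (line 3) is a guard that fails on an invalid 'state'. In the sketch below the when clause is taken from the false_condition reported just after, while the failure message is purely illustrative:

    - name: Ensure state in ["present", "absent"]
      fail:
        msg: "state must be one of: present, absent"   # illustrative message, not from the log
      when: state not in ["present", "absent"]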
18662 1726867311.74717: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 18662 1726867311.74770: in run() - task 0affcac9-a3a5-efab-a8ce-000000000131 18662 1726867311.74782: variable 'ansible_search_path' from source: unknown 18662 1726867311.74785: variable 'ansible_search_path' from source: unknown 18662 1726867311.74815: calling self._execute() 18662 1726867311.74865: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.74871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.74883: variable 'omit' from source: magic vars 18662 1726867311.75283: variable 'ansible_distribution_major_version' from source: facts 18662 1726867311.75286: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867311.75366: variable 'state' from source: include params 18662 1726867311.75382: Evaluated conditional (state not in ["present", "absent"]): False 18662 1726867311.75391: when evaluation is False, skipping this task 18662 1726867311.75399: _execute() done 18662 1726867311.75406: dumping result to json 18662 1726867311.75414: done dumping result, returning 18662 1726867311.75424: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [0affcac9-a3a5-efab-a8ce-000000000131] 18662 1726867311.75485: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000131 skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 18662 1726867311.75761: no more pending results, returning what we have 18662 1726867311.75764: results queue empty 18662 1726867311.75765: checking for any_errors_fatal 18662 1726867311.75766: done checking for any_errors_fatal 18662 1726867311.75767: checking for max_fail_percentage 18662 1726867311.75768: done checking for max_fail_percentage 18662 1726867311.75769: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.75770: done checking to see if all hosts have failed 18662 1726867311.75771: getting the remaining hosts for this loop 18662 1726867311.75772: done getting the remaining hosts for this loop 18662 1726867311.75775: getting the next task for host managed_node2 18662 1726867311.75782: done getting next task for host managed_node2 18662 1726867311.75784: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 18662 1726867311.75786: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867311.75789: getting variables 18662 1726867311.75790: in VariableManager get_vars() 18662 1726867311.75813: Calling all_inventory to load vars for managed_node2 18662 1726867311.75815: Calling groups_inventory to load vars for managed_node2 18662 1726867311.75818: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.75827: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.75829: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.75832: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.76008: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000131 18662 1726867311.76011: WORKER PROCESS EXITING 18662 1726867311.76033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.76167: done with get_vars() 18662 1726867311.76173: done getting variables 18662 1726867311.76219: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 17:21:51 -0400 (0:00:00.018) 0:00:06.397 ****** 18662 1726867311.76237: entering _queue_task() for managed_node2/fail 18662 1726867311.76408: worker is 1 (out of 1 available) 18662 1726867311.76419: exiting _queue_task() for managed_node2/fail 18662 1726867311.76429: done queuing things up, now waiting for results queue to drain 18662 1726867311.76431: waiting for pending results... 
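The second guard (manage_test_interface.yml:8) applies the same pattern to 'type'; again the when clause comes from the logged false_condition and the message is illustrative:

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: "type must be one of: dummy, tap, veth"   # illustrative message, not from the log
      when: type not in ["dummy", "tap", "veth"]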
18662 1726867311.76576: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 18662 1726867311.76632: in run() - task 0affcac9-a3a5-efab-a8ce-000000000132 18662 1726867311.76642: variable 'ansible_search_path' from source: unknown 18662 1726867311.76645: variable 'ansible_search_path' from source: unknown 18662 1726867311.76675: calling self._execute() 18662 1726867311.76741: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.76745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.76753: variable 'omit' from source: magic vars 18662 1726867311.77005: variable 'ansible_distribution_major_version' from source: facts 18662 1726867311.77017: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867311.77109: variable 'type' from source: set_fact 18662 1726867311.77116: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 18662 1726867311.77119: when evaluation is False, skipping this task 18662 1726867311.77122: _execute() done 18662 1726867311.77124: dumping result to json 18662 1726867311.77126: done dumping result, returning 18662 1726867311.77133: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcac9-a3a5-efab-a8ce-000000000132] 18662 1726867311.77137: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000132 18662 1726867311.77213: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000132 18662 1726867311.77216: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 18662 1726867311.77260: no more pending results, returning what we have 18662 1726867311.77264: results queue empty 18662 1726867311.77265: checking for any_errors_fatal 18662 1726867311.77269: done checking for any_errors_fatal 18662 1726867311.77269: checking for max_fail_percentage 18662 1726867311.77271: done checking for max_fail_percentage 18662 1726867311.77271: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.77272: done checking to see if all hosts have failed 18662 1726867311.77273: getting the remaining hosts for this loop 18662 1726867311.77274: done getting the remaining hosts for this loop 18662 1726867311.77279: getting the next task for host managed_node2 18662 1726867311.77285: done getting next task for host managed_node2 18662 1726867311.77286: ^ task is: TASK: Include the task 'show_interfaces.yml' 18662 1726867311.77289: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867311.77292: getting variables 18662 1726867311.77293: in VariableManager get_vars() 18662 1726867311.77314: Calling all_inventory to load vars for managed_node2 18662 1726867311.77317: Calling groups_inventory to load vars for managed_node2 18662 1726867311.77319: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.77334: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.77337: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.77340: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.77450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.77580: done with get_vars() 18662 1726867311.77586: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 17:21:51 -0400 (0:00:00.014) 0:00:06.411 ****** 18662 1726867311.77642: entering _queue_task() for managed_node2/include_tasks 18662 1726867311.77802: worker is 1 (out of 1 available) 18662 1726867311.77813: exiting _queue_task() for managed_node2/include_tasks 18662 1726867311.77824: done queuing things up, now waiting for results queue to drain 18662 1726867311.77825: waiting for pending results... 18662 1726867311.78192: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 18662 1726867311.78197: in run() - task 0affcac9-a3a5-efab-a8ce-000000000133 18662 1726867311.78200: variable 'ansible_search_path' from source: unknown 18662 1726867311.78202: variable 'ansible_search_path' from source: unknown 18662 1726867311.78205: calling self._execute() 18662 1726867311.78207: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.78212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.78214: variable 'omit' from source: magic vars 18662 1726867311.78534: variable 'ansible_distribution_major_version' from source: facts 18662 1726867311.78547: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867311.78555: _execute() done 18662 1726867311.78561: dumping result to json 18662 1726867311.78566: done dumping result, returning 18662 1726867311.78574: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-efab-a8ce-000000000133] 18662 1726867311.78585: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000133 18662 1726867311.78681: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000133 18662 1726867311.78689: WORKER PROCESS EXITING 18662 1726867311.78729: no more pending results, returning what we have 18662 1726867311.78735: in VariableManager get_vars() 18662 1726867311.78766: Calling all_inventory to load vars for managed_node2 18662 1726867311.78768: Calling groups_inventory to load vars for managed_node2 18662 1726867311.78772: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.78785: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.78788: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.78790: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.79070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 18662 1726867311.79284: done with get_vars() 18662 1726867311.79290: variable 'ansible_search_path' from source: unknown 18662 1726867311.79291: variable 'ansible_search_path' from source: unknown 18662 1726867311.79328: we have included files to process 18662 1726867311.79330: generating all_blocks data 18662 1726867311.79331: done generating all_blocks data 18662 1726867311.79336: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18662 1726867311.79337: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18662 1726867311.79339: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 18662 1726867311.79439: in VariableManager get_vars() 18662 1726867311.79454: done with get_vars() 18662 1726867311.79559: done processing included file 18662 1726867311.79561: iterating over new_blocks loaded from include file 18662 1726867311.79562: in VariableManager get_vars() 18662 1726867311.79574: done with get_vars() 18662 1726867311.79576: filtering new block on tags 18662 1726867311.79593: done filtering new block on tags 18662 1726867311.79596: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 18662 1726867311.79599: extending task lists for all hosts with included blocks 18662 1726867311.79997: done extending task lists 18662 1726867311.79998: done processing included files 18662 1726867311.79999: results queue empty 18662 1726867311.80000: checking for any_errors_fatal 18662 1726867311.80002: done checking for any_errors_fatal 18662 1726867311.80003: checking for max_fail_percentage 18662 1726867311.80004: done checking for max_fail_percentage 18662 1726867311.80005: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.80006: done checking to see if all hosts have failed 18662 1726867311.80006: getting the remaining hosts for this loop 18662 1726867311.80007: done getting the remaining hosts for this loop 18662 1726867311.80010: getting the next task for host managed_node2 18662 1726867311.80013: done getting next task for host managed_node2 18662 1726867311.80015: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 18662 1726867311.80018: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867311.80020: getting variables 18662 1726867311.80021: in VariableManager get_vars() 18662 1726867311.80028: Calling all_inventory to load vars for managed_node2 18662 1726867311.80030: Calling groups_inventory to load vars for managed_node2 18662 1726867311.80033: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.80037: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.80039: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.80042: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.80205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.80397: done with get_vars() 18662 1726867311.80408: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:21:51 -0400 (0:00:00.028) 0:00:06.440 ****** 18662 1726867311.80472: entering _queue_task() for managed_node2/include_tasks 18662 1726867311.80685: worker is 1 (out of 1 available) 18662 1726867311.80696: exiting _queue_task() for managed_node2/include_tasks 18662 1726867311.80707: done queuing things up, now waiting for results queue to drain 18662 1726867311.80708: waiting for pending results... 18662 1726867311.80940: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 18662 1726867311.81026: in run() - task 0affcac9-a3a5-efab-a8ce-00000000015c 18662 1726867311.81042: variable 'ansible_search_path' from source: unknown 18662 1726867311.81049: variable 'ansible_search_path' from source: unknown 18662 1726867311.81094: calling self._execute() 18662 1726867311.81168: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.81282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.81286: variable 'omit' from source: magic vars 18662 1726867311.81545: variable 'ansible_distribution_major_version' from source: facts 18662 1726867311.81561: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867311.81570: _execute() done 18662 1726867311.81580: dumping result to json 18662 1726867311.81587: done dumping result, returning 18662 1726867311.81602: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-efab-a8ce-00000000015c] 18662 1726867311.81614: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000015c 18662 1726867311.81748: no more pending results, returning what we have 18662 1726867311.81753: in VariableManager get_vars() 18662 1726867311.81787: Calling all_inventory to load vars for managed_node2 18662 1726867311.81790: Calling groups_inventory to load vars for managed_node2 18662 1726867311.81793: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.81807: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.81810: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.81815: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.82151: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000015c 18662 1726867311.82154: WORKER PROCESS EXITING 18662 1726867311.82175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 18662 1726867311.82395: done with get_vars() 18662 1726867311.82402: variable 'ansible_search_path' from source: unknown 18662 1726867311.82403: variable 'ansible_search_path' from source: unknown 18662 1726867311.82457: we have included files to process 18662 1726867311.82458: generating all_blocks data 18662 1726867311.82460: done generating all_blocks data 18662 1726867311.82461: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18662 1726867311.82462: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18662 1726867311.82464: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 18662 1726867311.82711: done processing included file 18662 1726867311.82713: iterating over new_blocks loaded from include file 18662 1726867311.82715: in VariableManager get_vars() 18662 1726867311.82727: done with get_vars() 18662 1726867311.82729: filtering new block on tags 18662 1726867311.82745: done filtering new block on tags 18662 1726867311.82747: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 18662 1726867311.82751: extending task lists for all hosts with included blocks 18662 1726867311.82899: done extending task lists 18662 1726867311.82901: done processing included files 18662 1726867311.82901: results queue empty 18662 1726867311.82902: checking for any_errors_fatal 18662 1726867311.82908: done checking for any_errors_fatal 18662 1726867311.82909: checking for max_fail_percentage 18662 1726867311.82910: done checking for max_fail_percentage 18662 1726867311.82911: checking to see if all hosts have failed and the running result is not ok 18662 1726867311.82912: done checking to see if all hosts have failed 18662 1726867311.82912: getting the remaining hosts for this loop 18662 1726867311.82914: done getting the remaining hosts for this loop 18662 1726867311.82916: getting the next task for host managed_node2 18662 1726867311.82920: done getting next task for host managed_node2 18662 1726867311.82922: ^ task is: TASK: Gather current interface info 18662 1726867311.82925: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 18662 1726867311.82927: getting variables 18662 1726867311.82928: in VariableManager get_vars() 18662 1726867311.82936: Calling all_inventory to load vars for managed_node2 18662 1726867311.82937: Calling groups_inventory to load vars for managed_node2 18662 1726867311.82940: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867311.82944: Calling all_plugins_play to load vars for managed_node2 18662 1726867311.82946: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867311.82949: Calling groups_plugins_play to load vars for managed_node2 18662 1726867311.83087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867311.83284: done with get_vars() 18662 1726867311.83292: done getting variables 18662 1726867311.83327: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:21:51 -0400 (0:00:00.028) 0:00:06.469 ****** 18662 1726867311.83358: entering _queue_task() for managed_node2/command 18662 1726867311.83765: worker is 1 (out of 1 available) 18662 1726867311.83773: exiting _queue_task() for managed_node2/command 18662 1726867311.83784: done queuing things up, now waiting for results queue to drain 18662 1726867311.83785: waiting for pending results... 
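The 'Gather current interface info' task queued here (get_current_interfaces.yml:3) is the command whose module arguments were logged at the top of this excerpt (chdir '/sys/class/net', raw params 'ls -1'). A minimal equivalent, with the register name taken from the later set_fact step, would be:

    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net
      register: _current_interfaces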
18662 1726867311.83805: running TaskExecutor() for managed_node2/TASK: Gather current interface info 18662 1726867311.83913: in run() - task 0affcac9-a3a5-efab-a8ce-000000000193 18662 1726867311.83932: variable 'ansible_search_path' from source: unknown 18662 1726867311.83940: variable 'ansible_search_path' from source: unknown 18662 1726867311.83974: calling self._execute() 18662 1726867311.84051: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.84064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.84079: variable 'omit' from source: magic vars 18662 1726867311.84480: variable 'ansible_distribution_major_version' from source: facts 18662 1726867311.84496: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867311.84505: variable 'omit' from source: magic vars 18662 1726867311.84565: variable 'omit' from source: magic vars 18662 1726867311.84605: variable 'omit' from source: magic vars 18662 1726867311.84646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867311.84692: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867311.84716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867311.84736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867311.84750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867311.84790: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867311.84800: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.84808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.84915: Set connection var ansible_timeout to 10 18662 1726867311.84923: Set connection var ansible_connection to ssh 18662 1726867311.84933: Set connection var ansible_shell_executable to /bin/sh 18662 1726867311.84939: Set connection var ansible_shell_type to sh 18662 1726867311.84952: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867311.84960: Set connection var ansible_pipelining to False 18662 1726867311.84993: variable 'ansible_shell_executable' from source: unknown 18662 1726867311.85001: variable 'ansible_connection' from source: unknown 18662 1726867311.85008: variable 'ansible_module_compression' from source: unknown 18662 1726867311.85014: variable 'ansible_shell_type' from source: unknown 18662 1726867311.85020: variable 'ansible_shell_executable' from source: unknown 18662 1726867311.85026: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867311.85032: variable 'ansible_pipelining' from source: unknown 18662 1726867311.85038: variable 'ansible_timeout' from source: unknown 18662 1726867311.85044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867311.85181: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867311.85210: variable 'omit' from source: magic vars 18662 
1726867311.85215: starting attempt loop 18662 1726867311.85282: running the handler 18662 1726867311.85285: _low_level_execute_command(): starting 18662 1726867311.85287: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867311.86084: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867311.86094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867311.86118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867311.86201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867311.87915: stdout chunk (state=3): >>>/root <<< 18662 1726867311.88037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867311.88048: stdout chunk (state=3): >>><<< 18662 1726867311.88060: stderr chunk (state=3): >>><<< 18662 1726867311.88140: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867311.88143: _low_level_execute_command(): starting 18662 1726867311.88148: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645 `" && echo ansible-tmp-1726867311.8808706-18964-225049918512645="` echo /root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645 `" ) && 
sleep 0' 18662 1726867311.88742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867311.88759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867311.88772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867311.88793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867311.88849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867311.88915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867311.88932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867311.88961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867311.89026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867311.91010: stdout chunk (state=3): >>>ansible-tmp-1726867311.8808706-18964-225049918512645=/root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645 <<< 18662 1726867311.91149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867311.91173: stderr chunk (state=3): >>><<< 18662 1726867311.91176: stdout chunk (state=3): >>><<< 18662 1726867311.91214: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867311.8808706-18964-225049918512645=/root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867311.91589: variable 'ansible_module_compression' from source: unknown 18662 1726867311.91593: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18662 1726867311.91596: variable 'ansible_facts' from source: unknown 18662 1726867311.91704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645/AnsiballZ_command.py 18662 1726867311.91925: Sending initial data 18662 1726867311.91933: Sent initial data (156 bytes) 18662 1726867311.92467: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867311.92495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867311.92575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867311.92608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867311.92625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867311.92649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867311.92722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867311.94302: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867311.94337: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867311.94373: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp7uc80dp9 /root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645/AnsiballZ_command.py <<< 18662 1726867311.94385: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645/AnsiballZ_command.py" <<< 18662 1726867311.94440: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp7uc80dp9" to remote "/root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645/AnsiballZ_command.py" <<< 18662 1726867311.95185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867311.95226: stderr chunk (state=3): >>><<< 18662 1726867311.95231: stdout chunk (state=3): >>><<< 18662 1726867311.95273: done transferring module to remote 18662 1726867311.95283: _low_level_execute_command(): starting 18662 1726867311.95288: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645/ /root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645/AnsiballZ_command.py && sleep 0' 18662 1726867311.95697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867311.95700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867311.95703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867311.95705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867311.95753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867311.95760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867311.95804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867311.97603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867311.97625: stderr chunk (state=3): >>><<< 18662 1726867311.97628: stdout chunk (state=3): >>><<< 18662 1726867311.97641: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867311.97644: _low_level_execute_command(): starting 18662 1726867311.97647: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645/AnsiballZ_command.py && sleep 0' 18662 1726867311.98046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867311.98049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867311.98051: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867311.98054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867311.98102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867311.98106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867311.98155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867312.14017: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:21:52.135750", "end": "2024-09-20 17:21:52.139100", "delta": "0:00:00.003350", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18662 1726867312.15622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867312.15650: stderr chunk (state=3): >>><<< 18662 1726867312.15654: stdout chunk (state=3): >>><<< 18662 1726867312.15670: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:21:52.135750", "end": "2024-09-20 17:21:52.139100", "delta": "0:00:00.003350", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
18662 1726867312.15701: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867312.15711: _low_level_execute_command(): starting 18662 1726867312.15714: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867311.8808706-18964-225049918512645/ > /dev/null 2>&1 && sleep 0' 18662 1726867312.16253: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867312.16257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867312.16274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867312.16331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867312.16341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867312.16344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867312.16383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867312.18393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867312.18397: stdout chunk (state=3): >>><<< 18662 1726867312.18399: stderr chunk (state=3): >>><<< 18662 1726867312.18402: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867312.18404: handler run complete 18662 1726867312.18406: Evaluated conditional (False): False 18662 1726867312.18411: attempt loop complete, returning result 18662 1726867312.18413: _execute() done 18662 1726867312.18415: dumping result to json 18662 1726867312.18417: done dumping result, returning 18662 1726867312.18419: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affcac9-a3a5-efab-a8ce-000000000193] 18662 1726867312.18421: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000193 ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003350", "end": "2024-09-20 17:21:52.139100", "rc": 0, "start": "2024-09-20 17:21:52.135750" } STDOUT: bonding_masters eth0 lo 18662 1726867312.18731: no more pending results, returning what we have 18662 1726867312.18736: results queue empty 18662 1726867312.18737: checking for any_errors_fatal 18662 1726867312.18739: done checking for any_errors_fatal 18662 1726867312.18739: checking for max_fail_percentage 18662 1726867312.18741: done checking for max_fail_percentage 18662 1726867312.18742: checking to see if all hosts have failed and the running result is not ok 18662 1726867312.18742: done checking to see if all hosts have failed 18662 1726867312.18743: getting the remaining hosts for this loop 18662 1726867312.18744: done getting the remaining hosts for this loop 18662 1726867312.18747: getting the next task for host managed_node2 18662 1726867312.18756: done getting next task for host managed_node2 18662 1726867312.18758: ^ task is: TASK: Set current_interfaces 18662 1726867312.18762: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867312.18766: getting variables 18662 1726867312.18767: in VariableManager get_vars() 18662 1726867312.19028: Calling all_inventory to load vars for managed_node2 18662 1726867312.19031: Calling groups_inventory to load vars for managed_node2 18662 1726867312.19034: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867312.19044: Calling all_plugins_play to load vars for managed_node2 18662 1726867312.19048: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867312.19051: Calling groups_plugins_play to load vars for managed_node2 18662 1726867312.19347: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000193 18662 1726867312.19350: WORKER PROCESS EXITING 18662 1726867312.19360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867312.19483: done with get_vars() 18662 1726867312.19491: done getting variables 18662 1726867312.19534: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:21:52 -0400 (0:00:00.361) 0:00:06.831 ****** 18662 1726867312.19556: entering _queue_task() for managed_node2/set_fact 18662 1726867312.19756: worker is 1 (out of 1 available) 18662 1726867312.19769: exiting _queue_task() for managed_node2/set_fact 18662 1726867312.19782: done queuing things up, now waiting for results queue to drain 18662 1726867312.19783: waiting for pending results... 
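The next task, Set current_interfaces (task path get_current_interfaces.yml:9), is a set_fact step that turns the stdout of the previous command into a list; the entries below show it reading the '_current_interfaces' variable and ending with current_interfaces = ['bonding_masters', 'eth0', 'lo']. A hedged Python sketch of that reduction follows; the actual Jinja expression in the playbook is not shown in this log, so the field access is an assumption.

    # Sketch of the set_fact reduction: command stdout -> list of names.
    # The stdout value is copied verbatim from the task result above.
    command_result = {"stdout": "bonding_masters\neth0\nlo", "rc": 0}

    # Roughly what a set_fact like
    #   current_interfaces: "{{ _current_interfaces.stdout_lines }}"
    # would evaluate to:
    current_interfaces = command_result["stdout"].splitlines()
    assert current_interfaces == ["bonding_masters", "eth0", "lo"]
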
18662 1726867312.19931: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 18662 1726867312.20005: in run() - task 0affcac9-a3a5-efab-a8ce-000000000194 18662 1726867312.20018: variable 'ansible_search_path' from source: unknown 18662 1726867312.20021: variable 'ansible_search_path' from source: unknown 18662 1726867312.20048: calling self._execute() 18662 1726867312.20104: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867312.20107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867312.20126: variable 'omit' from source: magic vars 18662 1726867312.20381: variable 'ansible_distribution_major_version' from source: facts 18662 1726867312.20390: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867312.20394: variable 'omit' from source: magic vars 18662 1726867312.20430: variable 'omit' from source: magic vars 18662 1726867312.20506: variable '_current_interfaces' from source: set_fact 18662 1726867312.20555: variable 'omit' from source: magic vars 18662 1726867312.20587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867312.20615: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867312.20631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867312.20644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867312.20652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867312.20679: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867312.20683: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867312.20685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867312.20752: Set connection var ansible_timeout to 10 18662 1726867312.20756: Set connection var ansible_connection to ssh 18662 1726867312.20758: Set connection var ansible_shell_executable to /bin/sh 18662 1726867312.20761: Set connection var ansible_shell_type to sh 18662 1726867312.20769: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867312.20775: Set connection var ansible_pipelining to False 18662 1726867312.20796: variable 'ansible_shell_executable' from source: unknown 18662 1726867312.20800: variable 'ansible_connection' from source: unknown 18662 1726867312.20802: variable 'ansible_module_compression' from source: unknown 18662 1726867312.20805: variable 'ansible_shell_type' from source: unknown 18662 1726867312.20807: variable 'ansible_shell_executable' from source: unknown 18662 1726867312.20809: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867312.20814: variable 'ansible_pipelining' from source: unknown 18662 1726867312.20817: variable 'ansible_timeout' from source: unknown 18662 1726867312.20820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867312.20921: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 18662 1726867312.20929: variable 'omit' from source: magic vars 18662 1726867312.20934: starting attempt loop 18662 1726867312.20937: running the handler 18662 1726867312.20946: handler run complete 18662 1726867312.20954: attempt loop complete, returning result 18662 1726867312.20956: _execute() done 18662 1726867312.20959: dumping result to json 18662 1726867312.20961: done dumping result, returning 18662 1726867312.20969: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affcac9-a3a5-efab-a8ce-000000000194] 18662 1726867312.20971: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000194 18662 1726867312.21046: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000194 18662 1726867312.21049: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 18662 1726867312.21108: no more pending results, returning what we have 18662 1726867312.21111: results queue empty 18662 1726867312.21112: checking for any_errors_fatal 18662 1726867312.21117: done checking for any_errors_fatal 18662 1726867312.21118: checking for max_fail_percentage 18662 1726867312.21120: done checking for max_fail_percentage 18662 1726867312.21120: checking to see if all hosts have failed and the running result is not ok 18662 1726867312.21121: done checking to see if all hosts have failed 18662 1726867312.21121: getting the remaining hosts for this loop 18662 1726867312.21123: done getting the remaining hosts for this loop 18662 1726867312.21126: getting the next task for host managed_node2 18662 1726867312.21133: done getting next task for host managed_node2 18662 1726867312.21135: ^ task is: TASK: Show current_interfaces 18662 1726867312.21138: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867312.21141: getting variables 18662 1726867312.21142: in VariableManager get_vars() 18662 1726867312.21171: Calling all_inventory to load vars for managed_node2 18662 1726867312.21173: Calling groups_inventory to load vars for managed_node2 18662 1726867312.21176: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867312.21186: Calling all_plugins_play to load vars for managed_node2 18662 1726867312.21188: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867312.21191: Calling groups_plugins_play to load vars for managed_node2 18662 1726867312.21358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867312.21617: done with get_vars() 18662 1726867312.21626: done getting variables 18662 1726867312.21680: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:21:52 -0400 (0:00:00.021) 0:00:06.852 ****** 18662 1726867312.21707: entering _queue_task() for managed_node2/debug 18662 1726867312.22184: worker is 1 (out of 1 available) 18662 1726867312.22190: exiting _queue_task() for managed_node2/debug 18662 1726867312.22199: done queuing things up, now waiting for results queue to drain 18662 1726867312.22200: waiting for pending results... 18662 1726867312.22270: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 18662 1726867312.22321: in run() - task 0affcac9-a3a5-efab-a8ce-00000000015d 18662 1726867312.22325: variable 'ansible_search_path' from source: unknown 18662 1726867312.22328: variable 'ansible_search_path' from source: unknown 18662 1726867312.22360: calling self._execute() 18662 1726867312.22443: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867312.22455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867312.22537: variable 'omit' from source: magic vars 18662 1726867312.22854: variable 'ansible_distribution_major_version' from source: facts 18662 1726867312.22880: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867312.22890: variable 'omit' from source: magic vars 18662 1726867312.22939: variable 'omit' from source: magic vars 18662 1726867312.23043: variable 'current_interfaces' from source: set_fact 18662 1726867312.23081: variable 'omit' from source: magic vars 18662 1726867312.23460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867312.23464: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867312.23466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867312.23468: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867312.23470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867312.23472: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867312.23473: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867312.23475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867312.23566: Set connection var ansible_timeout to 10 18662 1726867312.23569: Set connection var ansible_connection to ssh 18662 1726867312.23572: Set connection var ansible_shell_executable to /bin/sh 18662 1726867312.23574: Set connection var ansible_shell_type to sh 18662 1726867312.23576: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867312.23580: Set connection var ansible_pipelining to False 18662 1726867312.23592: variable 'ansible_shell_executable' from source: unknown 18662 1726867312.23599: variable 'ansible_connection' from source: unknown 18662 1726867312.23606: variable 'ansible_module_compression' from source: unknown 18662 1726867312.23612: variable 'ansible_shell_type' from source: unknown 18662 1726867312.23618: variable 'ansible_shell_executable' from source: unknown 18662 1726867312.23624: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867312.23631: variable 'ansible_pipelining' from source: unknown 18662 1726867312.23636: variable 'ansible_timeout' from source: unknown 18662 1726867312.23643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867312.23784: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867312.23797: variable 'omit' from source: magic vars 18662 1726867312.23802: starting attempt loop 18662 1726867312.23804: running the handler 18662 1726867312.23982: handler run complete 18662 1726867312.23986: attempt loop complete, returning result 18662 1726867312.23988: _execute() done 18662 1726867312.23990: dumping result to json 18662 1726867312.23992: done dumping result, returning 18662 1726867312.23995: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affcac9-a3a5-efab-a8ce-00000000015d] 18662 1726867312.23997: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000015d 18662 1726867312.24064: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000015d 18662 1726867312.24068: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 18662 1726867312.24113: no more pending results, returning what we have 18662 1726867312.24116: results queue empty 18662 1726867312.24116: checking for any_errors_fatal 18662 1726867312.24121: done checking for any_errors_fatal 18662 1726867312.24121: checking for max_fail_percentage 18662 1726867312.24123: done checking for max_fail_percentage 18662 1726867312.24124: checking to see if all hosts have failed and the running result is not ok 18662 1726867312.24124: done checking to see if all hosts have failed 18662 1726867312.24125: getting the remaining hosts for this loop 18662 1726867312.24126: done getting the remaining hosts for this loop 18662 1726867312.24129: getting the next task for host managed_node2 18662 1726867312.24135: done getting next task for host managed_node2 18662 1726867312.24137: ^ task is: TASK: Install iproute 18662 1726867312.24139: ^ state is: HOST STATE: block=2, task=5, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867312.24142: getting variables 18662 1726867312.24144: in VariableManager get_vars() 18662 1726867312.24165: Calling all_inventory to load vars for managed_node2 18662 1726867312.24167: Calling groups_inventory to load vars for managed_node2 18662 1726867312.24170: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867312.24179: Calling all_plugins_play to load vars for managed_node2 18662 1726867312.24182: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867312.24185: Calling groups_plugins_play to load vars for managed_node2 18662 1726867312.24350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867312.24556: done with get_vars() 18662 1726867312.24564: done getting variables 18662 1726867312.24620: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 17:21:52 -0400 (0:00:00.029) 0:00:06.881 ****** 18662 1726867312.24646: entering _queue_task() for managed_node2/package 18662 1726867312.25008: worker is 1 (out of 1 available) 18662 1726867312.25017: exiting _queue_task() for managed_node2/package 18662 1726867312.25026: done queuing things up, now waiting for results queue to drain 18662 1726867312.25027: waiting for pending results... 
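The Install iproute task goes through the package action plugin, which resolves to ansible.legacy.dnf and is executed remotely with the same low-level pattern the earlier command task used: create a private temp dir, sftp the AnsiballZ payload, chmod it, run it with the remote Python, then remove the temp dir. Below is a hedged sketch of that sequence using plain ssh/scp subprocess calls; the helper name, host handling, and temp-dir naming are illustrative assumptions, not Ansible's actual connection-plugin code.

    # Replays the per-task steps visible in this log (mkdir -> sftp put ->
    # chmod -> "python3.12 ... && sleep 0" -> rm -rf) for a prebuilt
    # AnsiballZ payload. Assumes passwordless ssh access to `host`.
    import os, random, subprocess, time

    def run_module_like_ansible(host, local_payload,
                                remote_python="/usr/bin/python3.12"):
        # 1. Private temp dir (umask 77 gives mode 700, as in the log).
        tmp = "~/.ansible/tmp/ansible-tmp-%s-%s-%s" % (
            time.time(), os.getpid(), random.randint(0, 2**48))
        subprocess.run(["ssh", host, f"umask 77 && mkdir -p {tmp}"], check=True)
        remote_payload = f"{tmp}/AnsiballZ_module.py"
        # 2. Transfer the payload (the log shows an sftp "put").
        subprocess.run(["scp", local_payload, f"{host}:{remote_payload}"], check=True)
        # 3. Make it executable, 4. run it, 5. clean up the temp dir.
        subprocess.run(["ssh", host, f"chmod u+x {tmp} {remote_payload}"], check=True)
        done = subprocess.run(["ssh", host, f"{remote_python} {remote_payload}"],
                              capture_output=True, text=True, check=True)
        subprocess.run(["ssh", host, f"rm -f -r {tmp}"], check=True)
        return done.stdout  # the JSON result document seen in the log
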
18662 1726867312.25121: running TaskExecutor() for managed_node2/TASK: Install iproute 18662 1726867312.25218: in run() - task 0affcac9-a3a5-efab-a8ce-000000000134 18662 1726867312.25237: variable 'ansible_search_path' from source: unknown 18662 1726867312.25244: variable 'ansible_search_path' from source: unknown 18662 1726867312.25292: calling self._execute() 18662 1726867312.25371: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867312.25441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867312.25445: variable 'omit' from source: magic vars 18662 1726867312.25744: variable 'ansible_distribution_major_version' from source: facts 18662 1726867312.25760: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867312.25769: variable 'omit' from source: magic vars 18662 1726867312.25819: variable 'omit' from source: magic vars 18662 1726867312.26013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867312.28396: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867312.28399: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867312.28433: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867312.28472: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867312.28524: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867312.28625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867312.28658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867312.28690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867312.28742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867312.28763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867312.28866: variable '__network_is_ostree' from source: set_fact 18662 1726867312.28878: variable 'omit' from source: magic vars 18662 1726867312.28940: variable 'omit' from source: magic vars 18662 1726867312.28943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867312.28972: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867312.28998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867312.29021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 18662 1726867312.29035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867312.29157: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867312.29161: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867312.29163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867312.29191: Set connection var ansible_timeout to 10 18662 1726867312.29200: Set connection var ansible_connection to ssh 18662 1726867312.29211: Set connection var ansible_shell_executable to /bin/sh 18662 1726867312.29219: Set connection var ansible_shell_type to sh 18662 1726867312.29234: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867312.29243: Set connection var ansible_pipelining to False 18662 1726867312.29280: variable 'ansible_shell_executable' from source: unknown 18662 1726867312.29289: variable 'ansible_connection' from source: unknown 18662 1726867312.29296: variable 'ansible_module_compression' from source: unknown 18662 1726867312.29303: variable 'ansible_shell_type' from source: unknown 18662 1726867312.29310: variable 'ansible_shell_executable' from source: unknown 18662 1726867312.29317: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867312.29325: variable 'ansible_pipelining' from source: unknown 18662 1726867312.29332: variable 'ansible_timeout' from source: unknown 18662 1726867312.29340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867312.29447: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867312.29462: variable 'omit' from source: magic vars 18662 1726867312.29485: starting attempt loop 18662 1726867312.29488: running the handler 18662 1726867312.29592: variable 'ansible_facts' from source: unknown 18662 1726867312.29596: variable 'ansible_facts' from source: unknown 18662 1726867312.29598: _low_level_execute_command(): starting 18662 1726867312.29600: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867312.30284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867312.30311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 
1726867312.30336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867312.30372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867312.30418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867312.32098: stdout chunk (state=3): >>>/root <<< 18662 1726867312.32197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867312.32288: stderr chunk (state=3): >>><<< 18662 1726867312.32291: stdout chunk (state=3): >>><<< 18662 1726867312.32294: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867312.32304: _low_level_execute_command(): starting 18662 1726867312.32308: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706 `" && echo ansible-tmp-1726867312.3225496-18991-280023819844706="` echo /root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706 `" ) && sleep 0' 18662 1726867312.32967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867312.32991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867312.33009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867312.33029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867312.33047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867312.33064: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867312.33169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867312.33193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867312.33210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867312.33350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867312.35207: stdout chunk (state=3): >>>ansible-tmp-1726867312.3225496-18991-280023819844706=/root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706 <<< 18662 1726867312.35313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867312.35362: stderr chunk (state=3): >>><<< 18662 1726867312.35381: stdout chunk (state=3): >>><<< 18662 1726867312.35403: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867312.3225496-18991-280023819844706=/root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867312.35483: variable 'ansible_module_compression' from source: unknown 18662 1726867312.35512: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 18662 1726867312.35523: ANSIBALLZ: Acquiring lock 18662 1726867312.35533: ANSIBALLZ: Lock acquired: 140264020905808 18662 1726867312.35542: ANSIBALLZ: Creating module 18662 1726867312.47348: ANSIBALLZ: Writing module into payload 18662 1726867312.47486: ANSIBALLZ: Writing module 18662 1726867312.47502: ANSIBALLZ: Renaming module 18662 1726867312.47516: ANSIBALLZ: Done creating module 18662 1726867312.47530: variable 'ansible_facts' from source: unknown 18662 1726867312.47589: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706/AnsiballZ_dnf.py 18662 1726867312.47689: Sending initial data 18662 1726867312.47692: Sent initial data (152 bytes) 18662 1726867312.48106: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867312.48110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867312.48113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867312.48115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 18662 1726867312.48118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867312.48168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867312.48171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867312.48174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867312.48222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867312.49874: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18662 1726867312.49880: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867312.49912: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867312.49956: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpi3afbxzo /root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706/AnsiballZ_dnf.py <<< 18662 1726867312.49959: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706/AnsiballZ_dnf.py" <<< 18662 1726867312.49994: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpi3afbxzo" to remote "/root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706/AnsiballZ_dnf.py" <<< 18662 1726867312.49997: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706/AnsiballZ_dnf.py" <<< 18662 1726867312.50654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867312.50689: stderr chunk (state=3): >>><<< 18662 1726867312.50692: stdout chunk (state=3): >>><<< 18662 1726867312.50739: done transferring module to remote 18662 1726867312.50747: _low_level_execute_command(): starting 18662 1726867312.50751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706/ /root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706/AnsiballZ_dnf.py && sleep 0' 18662 1726867312.51138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867312.51174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867312.51179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867312.51182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 18662 1726867312.51184: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867312.51186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867312.51231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867312.51234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867312.51239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867312.51287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867312.53101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867312.53121: stderr chunk (state=3): >>><<< 18662 1726867312.53124: stdout chunk (state=3): >>><<< 18662 1726867312.53136: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867312.53138: _low_level_execute_command(): starting 18662 1726867312.53143: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706/AnsiballZ_dnf.py && sleep 0' 18662 1726867312.53532: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867312.53536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867312.53538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867312.53541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867312.53589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867312.53596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867312.53638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867312.95636: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, 
"disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 18662 1726867313.00385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867313.00389: stdout chunk (state=3): >>><<< 18662 1726867313.00391: stderr chunk (state=3): >>><<< 18662 1726867313.00393: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
18662 1726867313.00718: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867313.00722: _low_level_execute_command(): starting 18662 1726867313.00724: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867312.3225496-18991-280023819844706/ > /dev/null 2>&1 && sleep 0' 18662 1726867313.01632: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.01983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.01993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.02057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.03903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.03944: stderr chunk (state=3): >>><<< 18662 1726867313.04093: stdout chunk (state=3): >>><<< 18662 1726867313.04113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867313.04116: handler run complete 18662 1726867313.04383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867313.04442: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867313.04482: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867313.04779: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867313.04811: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867313.04881: variable '__install_status' from source: unknown 18662 1726867313.04934: Evaluated conditional (__install_status is success): True 18662 1726867313.04937: attempt loop complete, returning result 18662 1726867313.04939: _execute() done 18662 1726867313.04941: dumping result to json 18662 1726867313.04943: done dumping result, returning 18662 1726867313.04945: done running TaskExecutor() for managed_node2/TASK: Install iproute [0affcac9-a3a5-efab-a8ce-000000000134] 18662 1726867313.04947: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000134 18662 1726867313.05035: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000134 18662 1726867313.05040: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 18662 1726867313.05323: no more pending results, returning what we have 18662 1726867313.05327: results queue empty 18662 1726867313.05328: checking for any_errors_fatal 18662 1726867313.05332: done checking for any_errors_fatal 18662 1726867313.05333: checking for max_fail_percentage 18662 1726867313.05334: done checking for max_fail_percentage 18662 1726867313.05335: checking to see if all hosts have failed and the running result is not ok 18662 1726867313.05336: done checking to see if all hosts have failed 18662 1726867313.05336: getting the remaining hosts for this loop 18662 1726867313.05338: done getting the remaining hosts for this loop 18662 1726867313.05341: getting the next task for host managed_node2 18662 1726867313.05347: done getting next task for host managed_node2 18662 1726867313.05350: ^ task is: TASK: Create veth interface {{ interface }} 18662 1726867313.05352: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867313.05356: getting variables 18662 1726867313.05357: in VariableManager get_vars() 18662 1726867313.05387: Calling all_inventory to load vars for managed_node2 18662 1726867313.05390: Calling groups_inventory to load vars for managed_node2 18662 1726867313.05393: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867313.05403: Calling all_plugins_play to load vars for managed_node2 18662 1726867313.05406: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867313.05412: Calling groups_plugins_play to load vars for managed_node2 18662 1726867313.06500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867313.06887: done with get_vars() 18662 1726867313.06897: done getting variables 18662 1726867313.06959: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867313.07072: variable 'interface' from source: set_fact TASK [Create veth interface lsr27] ********************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 17:21:53 -0400 (0:00:00.826) 0:00:07.708 ****** 18662 1726867313.07312: entering _queue_task() for managed_node2/command 18662 1726867313.07799: worker is 1 (out of 1 available) 18662 1726867313.07814: exiting _queue_task() for managed_node2/command 18662 1726867313.07826: done queuing things up, now waiting for results queue to drain 18662 1726867313.07828: waiting for pending results... 
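
With the package task done, the run moves on to TASK [Create veth interface lsr27] from manage_test_interface.yml:27. The trace that follows shows it is a command action driven by the items lookup, that interface is a set_fact variable resolving to lsr27, that it is gated on ansible_distribution_major_version != '6' and on type == 'veth' and state == 'present' and interface not in current_interfaces, and that the items executed in this excerpt are ip link add lsr27 type veth peer name peerlsr27 and ip link set peerlsr27 up. A hedged reconstruction, limited to what is visible here (the real task file may carry more items or options):

    # Hypothetical reconstruction of the looped task traced below; only loop items and
    # conditions visible in this excerpt are included, so the actual file may differ.
    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}
        - ip link set peer{{ interface }} up
      when: type == 'veth' and state == 'present' and interface not in current_interfaces
      # the ansible_distribution_major_version != '6' check seen in the trace most likely
      # comes from an enclosing block or include rather than from this task itself
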
18662 1726867313.08295: running TaskExecutor() for managed_node2/TASK: Create veth interface lsr27 18662 1726867313.08466: in run() - task 0affcac9-a3a5-efab-a8ce-000000000135 18662 1726867313.08501: variable 'ansible_search_path' from source: unknown 18662 1726867313.08505: variable 'ansible_search_path' from source: unknown 18662 1726867313.08922: variable 'interface' from source: set_fact 18662 1726867313.09383: variable 'interface' from source: set_fact 18662 1726867313.09387: variable 'interface' from source: set_fact 18662 1726867313.09782: Loaded config def from plugin (lookup/items) 18662 1726867313.09785: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 18662 1726867313.09788: variable 'omit' from source: magic vars 18662 1726867313.09790: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867313.09793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867313.09795: variable 'omit' from source: magic vars 18662 1726867313.10582: variable 'ansible_distribution_major_version' from source: facts 18662 1726867313.10585: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867313.10789: variable 'type' from source: set_fact 18662 1726867313.10799: variable 'state' from source: include params 18662 1726867313.10808: variable 'interface' from source: set_fact 18662 1726867313.10818: variable 'current_interfaces' from source: set_fact 18662 1726867313.10829: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18662 1726867313.10839: variable 'omit' from source: magic vars 18662 1726867313.10880: variable 'omit' from source: magic vars 18662 1726867313.10927: variable 'item' from source: unknown 18662 1726867313.11382: variable 'item' from source: unknown 18662 1726867313.11385: variable 'omit' from source: magic vars 18662 1726867313.11388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867313.11391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867313.11394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867313.11396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867313.11398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867313.11400: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867313.11403: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867313.11406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867313.11648: Set connection var ansible_timeout to 10 18662 1726867313.11660: Set connection var ansible_connection to ssh 18662 1726867313.11671: Set connection var ansible_shell_executable to /bin/sh 18662 1726867313.11681: Set connection var ansible_shell_type to sh 18662 1726867313.11697: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867313.11707: Set connection var ansible_pipelining to False 18662 1726867313.11733: variable 'ansible_shell_executable' from source: unknown 18662 1726867313.11741: variable 'ansible_connection' from source: unknown 18662 1726867313.11749: variable 
'ansible_module_compression' from source: unknown 18662 1726867313.11757: variable 'ansible_shell_type' from source: unknown 18662 1726867313.11765: variable 'ansible_shell_executable' from source: unknown 18662 1726867313.11772: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867313.12082: variable 'ansible_pipelining' from source: unknown 18662 1726867313.12086: variable 'ansible_timeout' from source: unknown 18662 1726867313.12090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867313.12135: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867313.12151: variable 'omit' from source: magic vars 18662 1726867313.12161: starting attempt loop 18662 1726867313.12169: running the handler 18662 1726867313.12190: _low_level_execute_command(): starting 18662 1726867313.12205: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867313.13481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.13783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.13864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.14008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.15732: stdout chunk (state=3): >>>/root <<< 18662 1726867313.15835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.15917: stderr chunk (state=3): >>><<< 18662 1726867313.15927: stdout chunk (state=3): >>><<< 18662 1726867313.15959: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867313.16036: _low_level_execute_command(): starting 18662 1726867313.16048: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697 `" && echo ansible-tmp-1726867313.1601996-19019-237292291435697="` echo /root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697 `" ) && sleep 0' 18662 1726867313.16914: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.16929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867313.16944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867313.16963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867313.16984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867313.16997: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867313.17011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.17094: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.17114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867313.17133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.17151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.17213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.19412: stdout chunk (state=3): >>>ansible-tmp-1726867313.1601996-19019-237292291435697=/root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697 <<< 18662 1726867313.19562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.19566: stdout chunk (state=3): >>><<< 18662 1726867313.19569: stderr chunk (state=3): >>><<< 18662 1726867313.19589: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867313.1601996-19019-237292291435697=/root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867313.19687: variable 'ansible_module_compression' from source: unknown 18662 1726867313.19716: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18662 1726867313.19753: variable 'ansible_facts' from source: unknown 18662 1726867313.19881: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697/AnsiballZ_command.py 18662 1726867313.20008: Sending initial data 18662 1726867313.20092: Sent initial data (156 bytes) 18662 1726867313.20587: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.20600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867313.20615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867313.20633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867313.20729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.20748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.20807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.22473: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867313.22527: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867313.22686: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpf8bnwz0u /root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697/AnsiballZ_command.py <<< 18662 1726867313.22689: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697/AnsiballZ_command.py" <<< 18662 1726867313.22692: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpf8bnwz0u" to remote "/root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697/AnsiballZ_command.py" <<< 18662 1726867313.23648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.23651: stderr chunk (state=3): >>><<< 18662 1726867313.23654: stdout chunk (state=3): >>><<< 18662 1726867313.23663: done transferring module to remote 18662 1726867313.23680: _low_level_execute_command(): starting 18662 1726867313.23690: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697/ /root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697/AnsiballZ_command.py && sleep 0' 18662 1726867313.24300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 18662 1726867313.24388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867313.24415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.24432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.24498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.26640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.26644: stdout chunk (state=3): >>><<< 18662 1726867313.26646: stderr chunk (state=3): >>><<< 18662 1726867313.26648: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867313.26650: _low_level_execute_command(): starting 18662 1726867313.26653: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697/AnsiballZ_command.py && sleep 0' 18662 1726867313.27735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.27738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867313.27741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.27743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867313.27746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867313.27748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.27792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867313.27806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.27865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.44206: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 17:21:53.433957", "end": "2024-09-20 17:21:53.439247", "delta": "0:00:00.005290", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": 
null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18662 1726867313.46990: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867313.46995: stdout chunk (state=3): >>><<< 18662 1726867313.46997: stderr chunk (state=3): >>><<< 18662 1726867313.47000: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-20 17:21:53.433957", "end": "2024-09-20 17:21:53.439247", "delta": "0:00:00.005290", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
18662 1726867313.47002: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr27 type veth peer name peerlsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867313.47018: _low_level_execute_command(): starting 18662 1726867313.47028: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867313.1601996-19019-237292291435697/ > /dev/null 2>&1 && sleep 0' 18662 1726867313.47718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.47734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867313.47772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867313.47789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867313.47891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867313.47918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.47931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.48121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.52357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.52360: stdout chunk (state=3): >>><<< 18662 1726867313.52363: stderr chunk (state=3): >>><<< 18662 1726867313.52583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867313.52587: handler run complete 18662 1726867313.52589: Evaluated conditional (False): False 18662 1726867313.52591: attempt loop complete, returning result 18662 1726867313.52593: variable 'item' from source: unknown 18662 1726867313.52595: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link add lsr27 type veth peer name peerlsr27) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27" ], "delta": "0:00:00.005290", "end": "2024-09-20 17:21:53.439247", "item": "ip link add lsr27 type veth peer name peerlsr27", "rc": 0, "start": "2024-09-20 17:21:53.433957" } 18662 1726867313.52962: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867313.52965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867313.52968: variable 'omit' from source: magic vars 18662 1726867313.52999: variable 'ansible_distribution_major_version' from source: facts 18662 1726867313.53014: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867313.53229: variable 'type' from source: set_fact 18662 1726867313.53239: variable 'state' from source: include params 18662 1726867313.53248: variable 'interface' from source: set_fact 18662 1726867313.53258: variable 'current_interfaces' from source: set_fact 18662 1726867313.53268: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18662 1726867313.53284: variable 'omit' from source: magic vars 18662 1726867313.53311: variable 'omit' from source: magic vars 18662 1726867313.53357: variable 'item' from source: unknown 18662 1726867313.53505: variable 'item' from source: unknown 18662 1726867313.53511: variable 'omit' from source: magic vars 18662 1726867313.53513: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867313.53516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867313.53519: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867313.53521: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867313.53527: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867313.53535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867313.53629: Set connection var ansible_timeout to 10 18662 1726867313.53637: Set connection var ansible_connection to ssh 18662 1726867313.53648: Set connection var ansible_shell_executable to /bin/sh 18662 1726867313.53656: Set connection var ansible_shell_type to sh 18662 1726867313.53670: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 
1726867313.53682: Set connection var ansible_pipelining to False 18662 1726867313.53711: variable 'ansible_shell_executable' from source: unknown 18662 1726867313.53727: variable 'ansible_connection' from source: unknown 18662 1726867313.53834: variable 'ansible_module_compression' from source: unknown 18662 1726867313.53838: variable 'ansible_shell_type' from source: unknown 18662 1726867313.53840: variable 'ansible_shell_executable' from source: unknown 18662 1726867313.53842: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867313.53844: variable 'ansible_pipelining' from source: unknown 18662 1726867313.53846: variable 'ansible_timeout' from source: unknown 18662 1726867313.53848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867313.53880: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867313.53895: variable 'omit' from source: magic vars 18662 1726867313.53904: starting attempt loop 18662 1726867313.53914: running the handler 18662 1726867313.53926: _low_level_execute_command(): starting 18662 1726867313.53939: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867313.54573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.54601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867313.54701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.54733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867313.54749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.54770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.54846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.56481: stdout chunk (state=3): >>>/root <<< 18662 1726867313.56636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.56640: stdout chunk (state=3): >>><<< 18662 1726867313.56642: stderr chunk (state=3): >>><<< 18662 1726867313.56737: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867313.56741: _low_level_execute_command(): starting 18662 1726867313.56743: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487 `" && echo ansible-tmp-1726867313.566594-19019-214153566675487="` echo /root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487 `" ) && sleep 0' 18662 1726867313.57284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.57299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867313.57326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867313.57342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867313.57359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867313.57371: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867313.57394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.57436: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867313.57452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867313.57492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.57550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867313.57567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.57597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.57670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.59599: stdout chunk (state=3): >>>ansible-tmp-1726867313.566594-19019-214153566675487=/root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487 <<< 18662 1726867313.59767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 
1726867313.59771: stdout chunk (state=3): >>><<< 18662 1726867313.59773: stderr chunk (state=3): >>><<< 18662 1726867313.59791: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867313.566594-19019-214153566675487=/root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867313.59823: variable 'ansible_module_compression' from source: unknown 18662 1726867313.59983: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18662 1726867313.59987: variable 'ansible_facts' from source: unknown 18662 1726867313.59989: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487/AnsiballZ_command.py 18662 1726867313.60198: Sending initial data 18662 1726867313.60392: Sent initial data (155 bytes) 18662 1726867313.60725: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.60739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867313.60755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867313.60775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867313.60796: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867313.60897: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.60923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.61004: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 18662 1726867313.62604: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867313.62635: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867313.62688: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmprjzy4j46 /root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487/AnsiballZ_command.py <<< 18662 1726867313.62725: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487/AnsiballZ_command.py" <<< 18662 1726867313.62729: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmprjzy4j46" to remote "/root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487/AnsiballZ_command.py" <<< 18662 1726867313.63496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.63529: stderr chunk (state=3): >>><<< 18662 1726867313.63532: stdout chunk (state=3): >>><<< 18662 1726867313.63581: done transferring module to remote 18662 1726867313.63596: _low_level_execute_command(): starting 18662 1726867313.63637: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487/ /root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487/AnsiballZ_command.py && sleep 0' 18662 1726867313.64222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.64237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867313.64304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.64367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867313.64390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.64430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.64494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.66344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.66348: stdout chunk (state=3): >>><<< 18662 1726867313.66350: stderr chunk (state=3): >>><<< 18662 1726867313.66443: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867313.66447: _low_level_execute_command(): starting 18662 1726867313.66450: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487/AnsiballZ_command.py && sleep 0' 18662 1726867313.67003: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.67028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867313.67043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867313.67059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867313.67079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867313.67093: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867313.67107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.67196: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.67236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 
1726867313.67280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.67324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.83345: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 17:21:53.828279", "end": "2024-09-20 17:21:53.832287", "delta": "0:00:00.004008", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18662 1726867313.85184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867313.85192: stdout chunk (state=3): >>><<< 18662 1726867313.85195: stderr chunk (state=3): >>><<< 18662 1726867313.85198: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-20 17:21:53.828279", "end": "2024-09-20 17:21:53.832287", "delta": "0:00:00.004008", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
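The module_args echoed in the stdout JSON above ('_raw_params': 'ip link set peerlsr27 up', '_uses_shell': false) are what a plain command task produces, so the remote run that just completed can be read as the equivalent of the following minimal task. This is an illustrative sketch, not text copied from manage_test_interface.yml; the changed_when line is a guess, inferred from the "changed": false that the callback reports a few entries further below.

- name: Bring the peer end of the veth pair up    # illustrative equivalent of the logged invocation
  ansible.builtin.command: ip link set peerlsr27 up
  changed_when: false    # assumption, consistent with the callback reporting changed: false
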
18662 1726867313.85200: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867313.85202: _low_level_execute_command(): starting 18662 1726867313.85204: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867313.566594-19019-214153566675487/ > /dev/null 2>&1 && sleep 0' 18662 1726867313.85745: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.85753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867313.85764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867313.85781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867313.85793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867313.85800: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867313.85812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.85824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18662 1726867313.85841: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 18662 1726867313.85849: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18662 1726867313.85932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.85955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.86023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.87878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.88082: stderr chunk (state=3): >>><<< 18662 1726867313.88085: stdout chunk (state=3): >>><<< 18662 1726867313.88088: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867313.88090: handler run complete 18662 1726867313.88092: Evaluated conditional (False): False 18662 1726867313.88094: attempt loop complete, returning result 18662 1726867313.88096: variable 'item' from source: unknown 18662 1726867313.88098: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set peerlsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr27", "up" ], "delta": "0:00:00.004008", "end": "2024-09-20 17:21:53.832287", "item": "ip link set peerlsr27 up", "rc": 0, "start": "2024-09-20 17:21:53.828279" } 18662 1726867313.88211: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867313.88215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867313.88218: variable 'omit' from source: magic vars 18662 1726867313.88368: variable 'ansible_distribution_major_version' from source: facts 18662 1726867313.88371: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867313.88569: variable 'type' from source: set_fact 18662 1726867313.88572: variable 'state' from source: include params 18662 1726867313.88574: variable 'interface' from source: set_fact 18662 1726867313.88579: variable 'current_interfaces' from source: set_fact 18662 1726867313.88586: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 18662 1726867313.88589: variable 'omit' from source: magic vars 18662 1726867313.88682: variable 'omit' from source: magic vars 18662 1726867313.88686: variable 'item' from source: unknown 18662 1726867313.88714: variable 'item' from source: unknown 18662 1726867313.88739: variable 'omit' from source: magic vars 18662 1726867313.88760: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867313.88767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867313.88774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867313.88789: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867313.88792: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867313.88794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867313.88882: Set connection var ansible_timeout to 10 18662 1726867313.88885: Set connection var ansible_connection to ssh 18662 1726867313.88890: Set connection var ansible_shell_executable 
to /bin/sh 18662 1726867313.88893: Set connection var ansible_shell_type to sh 18662 1726867313.88922: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867313.88925: Set connection var ansible_pipelining to False 18662 1726867313.88928: variable 'ansible_shell_executable' from source: unknown 18662 1726867313.88930: variable 'ansible_connection' from source: unknown 18662 1726867313.88932: variable 'ansible_module_compression' from source: unknown 18662 1726867313.88943: variable 'ansible_shell_type' from source: unknown 18662 1726867313.88946: variable 'ansible_shell_executable' from source: unknown 18662 1726867313.88948: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867313.89182: variable 'ansible_pipelining' from source: unknown 18662 1726867313.89185: variable 'ansible_timeout' from source: unknown 18662 1726867313.89187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867313.89190: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867313.89192: variable 'omit' from source: magic vars 18662 1726867313.89194: starting attempt loop 18662 1726867313.89196: running the handler 18662 1726867313.89198: _low_level_execute_command(): starting 18662 1726867313.89200: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867313.89762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867313.89785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.89797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.89865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.91522: stdout chunk (state=3): >>>/root <<< 18662 1726867313.91660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.91676: stdout chunk (state=3): >>><<< 18662 1726867313.91690: stderr chunk (state=3): >>><<< 18662 1726867313.91712: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867313.91726: _low_level_execute_command(): starting 18662 1726867313.91735: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841 `" && echo ansible-tmp-1726867313.917174-19019-86146922055841="` echo /root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841 `" ) && sleep 0' 18662 1726867313.92338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867313.92353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867313.92366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867313.92451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.92488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867313.92504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.92559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.92597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.94522: stdout chunk (state=3): >>>ansible-tmp-1726867313.917174-19019-86146922055841=/root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841 <<< 18662 1726867313.94657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.94663: stdout chunk (state=3): >>><<< 18662 1726867313.94670: stderr chunk (state=3): >>><<< 18662 1726867313.94685: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867313.917174-19019-86146922055841=/root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867313.94707: variable 'ansible_module_compression' from source: unknown 18662 1726867313.94738: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18662 1726867313.94759: variable 'ansible_facts' from source: unknown 18662 1726867313.94803: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841/AnsiballZ_command.py 18662 1726867313.94891: Sending initial data 18662 1726867313.94900: Sent initial data (154 bytes) 18662 1726867313.95391: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.95403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.95442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.95491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867313.97057: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports 
extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18662 1726867313.97063: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867313.97098: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867313.97134: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp0od4yvct /root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841/AnsiballZ_command.py <<< 18662 1726867313.97141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841/AnsiballZ_command.py" <<< 18662 1726867313.97171: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp0od4yvct" to remote "/root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841/AnsiballZ_command.py" <<< 18662 1726867313.97175: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841/AnsiballZ_command.py" <<< 18662 1726867313.97692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867313.97716: stderr chunk (state=3): >>><<< 18662 1726867313.97719: stdout chunk (state=3): >>><<< 18662 1726867313.97746: done transferring module to remote 18662 1726867313.97755: _low_level_execute_command(): starting 18662 1726867313.97758: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841/ /root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841/AnsiballZ_command.py && sleep 0' 18662 1726867313.98369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867313.98373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867313.98375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867313.98409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867313.98485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.00279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867314.00294: stderr chunk (state=3): >>><<< 18662 1726867314.00298: stdout chunk (state=3): >>><<< 18662 1726867314.00311: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867314.00314: _low_level_execute_command(): starting 18662 1726867314.00317: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841/AnsiballZ_command.py && sleep 0' 18662 1726867314.00736: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867314.00782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 18662 1726867314.00790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.00839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867314.00869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.00929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.16816: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 17:21:54.162451", "end": "2024-09-20 17:21:54.166165", "delta": "0:00:00.003714", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18662 1726867314.18360: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867314.18391: stderr chunk (state=3): >>><<< 18662 1726867314.18395: stdout chunk (state=3): >>><<< 18662 1726867314.18412: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-20 17:21:54.162451", "end": "2024-09-20 17:21:54.166165", "delta": "0:00:00.003714", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
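Both loop items of the "Create veth interface lsr27" task have now completed: ip link set peerlsr27 up earlier, and ip link set lsr27 up here, each gated on ansible_distribution_major_version != '6' and on type == 'veth' and state == 'present' and interface not in current_interfaces. A plausible reconstruction of that looped task follows; it is sketched from the logged items, loop variable and conditionals, not quoted from the playbook, and the first item (creating the veth pair itself) is an assumption, since that step ran before this part of the log.

- name: Create veth interface lsr27    # reconstruction; the literal task in manage_test_interface.yml may differ
  ansible.builtin.command: "{{ item }}"
  loop:
    - ip link add lsr27 type veth peer name peerlsr27    # assumed earlier item, not shown in this excerpt
    - ip link set peerlsr27 up
    - ip link set lsr27 up
  when:
    - ansible_distribution_major_version != '6'
    - type == 'veth' and state == 'present' and interface not in current_interfaces
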
18662 1726867314.18430: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867314.18435: _low_level_execute_command(): starting 18662 1726867314.18440: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867313.917174-19019-86146922055841/ > /dev/null 2>&1 && sleep 0' 18662 1726867314.18865: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867314.18872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867314.18897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.18900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.18902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.18960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867314.18967: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867314.18969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.19007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.20834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867314.20862: stderr chunk (state=3): >>><<< 18662 1726867314.20865: stdout chunk (state=3): >>><<< 18662 1726867314.20878: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867314.20882: handler run complete 18662 1726867314.20897: Evaluated conditional (False): False 18662 1726867314.20904: attempt loop complete, returning result 18662 1726867314.20921: variable 'item' from source: unknown 18662 1726867314.20987: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set lsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr27", "up" ], "delta": "0:00:00.003714", "end": "2024-09-20 17:21:54.166165", "item": "ip link set lsr27 up", "rc": 0, "start": "2024-09-20 17:21:54.162451" } 18662 1726867314.21102: dumping result to json 18662 1726867314.21104: done dumping result, returning 18662 1726867314.21107: done running TaskExecutor() for managed_node2/TASK: Create veth interface lsr27 [0affcac9-a3a5-efab-a8ce-000000000135] 18662 1726867314.21109: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000135 18662 1726867314.21153: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000135 18662 1726867314.21156: WORKER PROCESS EXITING 18662 1726867314.21218: no more pending results, returning what we have 18662 1726867314.21221: results queue empty 18662 1726867314.21222: checking for any_errors_fatal 18662 1726867314.21228: done checking for any_errors_fatal 18662 1726867314.21228: checking for max_fail_percentage 18662 1726867314.21229: done checking for max_fail_percentage 18662 1726867314.21230: checking to see if all hosts have failed and the running result is not ok 18662 1726867314.21231: done checking to see if all hosts have failed 18662 1726867314.21231: getting the remaining hosts for this loop 18662 1726867314.21232: done getting the remaining hosts for this loop 18662 1726867314.21236: getting the next task for host managed_node2 18662 1726867314.21242: done getting next task for host managed_node2 18662 1726867314.21244: ^ task is: TASK: Set up veth as managed by NetworkManager 18662 1726867314.21246: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867314.21250: getting variables 18662 1726867314.21251: in VariableManager get_vars() 18662 1726867314.21281: Calling all_inventory to load vars for managed_node2 18662 1726867314.21283: Calling groups_inventory to load vars for managed_node2 18662 1726867314.21286: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867314.21303: Calling all_plugins_play to load vars for managed_node2 18662 1726867314.21305: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867314.21308: Calling groups_plugins_play to load vars for managed_node2 18662 1726867314.21480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867314.21593: done with get_vars() 18662 1726867314.21601: done getting variables 18662 1726867314.21646: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 17:21:54 -0400 (0:00:01.143) 0:00:08.852 ****** 18662 1726867314.21667: entering _queue_task() for managed_node2/command 18662 1726867314.21860: worker is 1 (out of 1 available) 18662 1726867314.21872: exiting _queue_task() for managed_node2/command 18662 1726867314.21884: done queuing things up, now waiting for results queue to drain 18662 1726867314.21885: waiting for pending results... 
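For each task the worker resolves its connection settings before running anything, which is what the "Set connection var ansible_timeout to 10 / ansible_connection to ssh / ansible_shell_executable to /bin/sh / ..." lines above (and again below for this task) record. Those values come from Ansible's documented connection variables and configuration defaults rather than from the tasks themselves; purely as an illustration (not read from the inventory under /tmp/network-5rw), the same values could be pinned as host variables for the managed node:

# host_vars/managed_node2.yml - illustrative only, using Ansible's standard connection variable names
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
# ansible_module_compression: ZIP_DEFLATED   # the default that the log shows being applied
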
18662 1726867314.22130: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 18662 1726867314.22135: in run() - task 0affcac9-a3a5-efab-a8ce-000000000136 18662 1726867314.22138: variable 'ansible_search_path' from source: unknown 18662 1726867314.22140: variable 'ansible_search_path' from source: unknown 18662 1726867314.22142: calling self._execute() 18662 1726867314.22187: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.22191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867314.22201: variable 'omit' from source: magic vars 18662 1726867314.22472: variable 'ansible_distribution_major_version' from source: facts 18662 1726867314.22483: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867314.22584: variable 'type' from source: set_fact 18662 1726867314.22588: variable 'state' from source: include params 18662 1726867314.22593: Evaluated conditional (type == 'veth' and state == 'present'): True 18662 1726867314.22599: variable 'omit' from source: magic vars 18662 1726867314.22624: variable 'omit' from source: magic vars 18662 1726867314.22693: variable 'interface' from source: set_fact 18662 1726867314.22706: variable 'omit' from source: magic vars 18662 1726867314.22737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867314.22770: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867314.22784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867314.22797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867314.22805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867314.22829: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867314.22832: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.22834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867314.22904: Set connection var ansible_timeout to 10 18662 1726867314.22907: Set connection var ansible_connection to ssh 18662 1726867314.22912: Set connection var ansible_shell_executable to /bin/sh 18662 1726867314.22915: Set connection var ansible_shell_type to sh 18662 1726867314.22922: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867314.22926: Set connection var ansible_pipelining to False 18662 1726867314.22945: variable 'ansible_shell_executable' from source: unknown 18662 1726867314.22948: variable 'ansible_connection' from source: unknown 18662 1726867314.22950: variable 'ansible_module_compression' from source: unknown 18662 1726867314.22953: variable 'ansible_shell_type' from source: unknown 18662 1726867314.22955: variable 'ansible_shell_executable' from source: unknown 18662 1726867314.22957: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.22960: variable 'ansible_pipelining' from source: unknown 18662 1726867314.22962: variable 'ansible_timeout' from source: unknown 18662 1726867314.22967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867314.23066: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867314.23074: variable 'omit' from source: magic vars 18662 1726867314.23080: starting attempt loop 18662 1726867314.23083: running the handler 18662 1726867314.23100: _low_level_execute_command(): starting 18662 1726867314.23106: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867314.23601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.23605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.23611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.23614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.23665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867314.23668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867314.23673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.23718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.25366: stdout chunk (state=3): >>>/root <<< 18662 1726867314.25468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867314.25496: stderr chunk (state=3): >>><<< 18662 1726867314.25503: stdout chunk (state=3): >>><<< 18662 1726867314.25522: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867314.25533: _low_level_execute_command(): starting 18662 1726867314.25539: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948 `" && echo ansible-tmp-1726867314.2552135-19073-102382993826948="` echo /root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948 `" ) && sleep 0' 18662 1726867314.25964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.25974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.25979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.25981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.26025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867314.26028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867314.26034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.26076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.27949: stdout chunk (state=3): >>>ansible-tmp-1726867314.2552135-19073-102382993826948=/root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948 <<< 18662 1726867314.28056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867314.28079: stderr chunk (state=3): >>><<< 18662 1726867314.28082: stdout chunk (state=3): >>><<< 18662 1726867314.28094: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867314.2552135-19073-102382993826948=/root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867314.28124: variable 'ansible_module_compression' from source: unknown 18662 1726867314.28163: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18662 1726867314.28193: variable 'ansible_facts' from source: unknown 18662 1726867314.28251: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948/AnsiballZ_command.py 18662 1726867314.28344: Sending initial data 18662 1726867314.28347: Sent initial data (156 bytes) 18662 1726867314.28766: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.28769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.28772: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867314.28774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.28830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867314.28836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867314.28839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.28876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.30442: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18662 1726867314.30446: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867314.30480: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867314.30520: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpvahzypt_ /root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948/AnsiballZ_command.py <<< 18662 1726867314.30527: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948/AnsiballZ_command.py" <<< 18662 1726867314.30557: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpvahzypt_" to remote "/root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948/AnsiballZ_command.py" <<< 18662 1726867314.31073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867314.31105: stderr chunk (state=3): >>><<< 18662 1726867314.31109: stdout chunk (state=3): >>><<< 18662 1726867314.31146: done transferring module to remote 18662 1726867314.31158: _low_level_execute_command(): starting 18662 1726867314.31162: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948/ /root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948/AnsiballZ_command.py && sleep 0' 18662 1726867314.31547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.31550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.31555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 18662 1726867314.31557: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.31559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.31613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867314.31615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.31650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.33404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867314.33429: stderr chunk (state=3): >>><<< 18662 1726867314.33432: stdout chunk (state=3): >>><<< 18662 1726867314.33445: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867314.33449: _low_level_execute_command(): starting 18662 1726867314.33451: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948/AnsiballZ_command.py && sleep 0' 18662 1726867314.33883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.33887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.33889: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867314.33891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 18662 1726867314.33893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.33940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867314.33943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.33989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.51089: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 17:21:54.491642", "end": "2024-09-20 17:21:54.509360", "delta": "0:00:00.017718", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18662 1726867314.52825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867314.52829: stdout chunk (state=3): >>><<< 18662 1726867314.52832: stderr chunk (state=3): >>><<< 18662 1726867314.52851: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-20 17:21:54.491642", "end": "2024-09-20 17:21:54.509360", "delta": "0:00:00.017718", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
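The exchange above completes one full module round trip for the ansible.legacy.command action: the AnsiballZ_command.py payload is uploaded over SFTP into the per-task temp directory, made executable with chmod u+x, and run with /usr/bin/python3.12 over the multiplexed SSH connection; the JSON on stdout is the module result, showing nmcli d set lsr27 managed true exiting with rc=0 after roughly 18 ms. The task file itself is never quoted in this log, so the fragment below is only a plausible reconstruction of how such a task could be written; the changed_when: false is inferred from the module returning "changed": true while the final task result a few entries further down reports "changed": false, immediately after the log notes "Evaluated conditional (False): False".

# Hypothetical reconstruction -- not quoted from manage_test_interface.yml.
- name: Set up veth as managed by NetworkManager
  ansible.builtin.command: nmcli d set {{ interface }} managed true  # {{ interface }} resolves to lsr27 in this run
  changed_when: false  # inferred from the changed=true module result vs. the changed=false task result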
18662 1726867314.52915: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr27 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867314.52983: _low_level_execute_command(): starting 18662 1726867314.52986: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867314.2552135-19073-102382993826948/ > /dev/null 2>&1 && sleep 0' 18662 1726867314.53613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867314.53643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867314.53660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.53686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867314.53761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.53811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867314.53829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867314.53867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.53935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.55981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867314.55985: stdout chunk (state=3): >>><<< 18662 1726867314.55987: stderr chunk (state=3): >>><<< 18662 1726867314.55989: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867314.55992: handler run complete 18662 1726867314.55994: Evaluated conditional (False): False 18662 1726867314.55995: attempt loop complete, returning result 18662 1726867314.55997: _execute() done 18662 1726867314.55999: dumping result to json 18662 1726867314.56000: done dumping result, returning 18662 1726867314.56002: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [0affcac9-a3a5-efab-a8ce-000000000136] 18662 1726867314.56004: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000136 18662 1726867314.56070: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000136 18662 1726867314.56073: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr27", "managed", "true" ], "delta": "0:00:00.017718", "end": "2024-09-20 17:21:54.509360", "rc": 0, "start": "2024-09-20 17:21:54.491642" } 18662 1726867314.56144: no more pending results, returning what we have 18662 1726867314.56148: results queue empty 18662 1726867314.56149: checking for any_errors_fatal 18662 1726867314.56158: done checking for any_errors_fatal 18662 1726867314.56159: checking for max_fail_percentage 18662 1726867314.56161: done checking for max_fail_percentage 18662 1726867314.56161: checking to see if all hosts have failed and the running result is not ok 18662 1726867314.56162: done checking to see if all hosts have failed 18662 1726867314.56163: getting the remaining hosts for this loop 18662 1726867314.56164: done getting the remaining hosts for this loop 18662 1726867314.56168: getting the next task for host managed_node2 18662 1726867314.56175: done getting next task for host managed_node2 18662 1726867314.56285: ^ task is: TASK: Delete veth interface {{ interface }} 18662 1726867314.56292: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867314.56297: getting variables 18662 1726867314.56298: in VariableManager get_vars() 18662 1726867314.56326: Calling all_inventory to load vars for managed_node2 18662 1726867314.56328: Calling groups_inventory to load vars for managed_node2 18662 1726867314.56332: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867314.56342: Calling all_plugins_play to load vars for managed_node2 18662 1726867314.56345: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867314.56347: Calling groups_plugins_play to load vars for managed_node2 18662 1726867314.56609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867314.56820: done with get_vars() 18662 1726867314.56831: done getting variables 18662 1726867314.56889: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867314.57010: variable 'interface' from source: set_fact TASK [Delete veth interface lsr27] ********************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 17:21:54 -0400 (0:00:00.353) 0:00:09.205 ****** 18662 1726867314.57038: entering _queue_task() for managed_node2/command 18662 1726867314.57291: worker is 1 (out of 1 available) 18662 1726867314.57303: exiting _queue_task() for managed_node2/command 18662 1726867314.57315: done queuing things up, now waiting for results queue to drain 18662 1726867314.57316: waiting for pending results... 
18662 1726867314.57558: running TaskExecutor() for managed_node2/TASK: Delete veth interface lsr27 18662 1726867314.57650: in run() - task 0affcac9-a3a5-efab-a8ce-000000000137 18662 1726867314.57679: variable 'ansible_search_path' from source: unknown 18662 1726867314.57702: variable 'ansible_search_path' from source: unknown 18662 1726867314.57784: calling self._execute() 18662 1726867314.57840: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.57852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867314.57867: variable 'omit' from source: magic vars 18662 1726867314.58267: variable 'ansible_distribution_major_version' from source: facts 18662 1726867314.58290: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867314.58514: variable 'type' from source: set_fact 18662 1726867314.58524: variable 'state' from source: include params 18662 1726867314.58543: variable 'interface' from source: set_fact 18662 1726867314.58546: variable 'current_interfaces' from source: set_fact 18662 1726867314.58574: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 18662 1726867314.58579: when evaluation is False, skipping this task 18662 1726867314.58581: _execute() done 18662 1726867314.58584: dumping result to json 18662 1726867314.58653: done dumping result, returning 18662 1726867314.58656: done running TaskExecutor() for managed_node2/TASK: Delete veth interface lsr27 [0affcac9-a3a5-efab-a8ce-000000000137] 18662 1726867314.58659: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000137 18662 1726867314.58729: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000137 18662 1726867314.58733: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 18662 1726867314.58980: no more pending results, returning what we have 18662 1726867314.58984: results queue empty 18662 1726867314.58985: checking for any_errors_fatal 18662 1726867314.58993: done checking for any_errors_fatal 18662 1726867314.58994: checking for max_fail_percentage 18662 1726867314.58996: done checking for max_fail_percentage 18662 1726867314.58997: checking to see if all hosts have failed and the running result is not ok 18662 1726867314.58998: done checking to see if all hosts have failed 18662 1726867314.58998: getting the remaining hosts for this loop 18662 1726867314.58999: done getting the remaining hosts for this loop 18662 1726867314.59003: getting the next task for host managed_node2 18662 1726867314.59008: done getting next task for host managed_node2 18662 1726867314.59011: ^ task is: TASK: Create dummy interface {{ interface }} 18662 1726867314.59014: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867314.59018: getting variables 18662 1726867314.59019: in VariableManager get_vars() 18662 1726867314.59045: Calling all_inventory to load vars for managed_node2 18662 1726867314.59048: Calling groups_inventory to load vars for managed_node2 18662 1726867314.59052: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867314.59062: Calling all_plugins_play to load vars for managed_node2 18662 1726867314.59065: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867314.59067: Calling groups_plugins_play to load vars for managed_node2 18662 1726867314.59276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867314.59524: done with get_vars() 18662 1726867314.59534: done getting variables 18662 1726867314.59597: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867314.59707: variable 'interface' from source: set_fact TASK [Create dummy interface lsr27] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 17:21:54 -0400 (0:00:00.026) 0:00:09.232 ****** 18662 1726867314.59740: entering _queue_task() for managed_node2/command 18662 1726867314.59993: worker is 1 (out of 1 available) 18662 1726867314.60004: exiting _queue_task() for managed_node2/command 18662 1726867314.60014: done queuing things up, now waiting for results queue to drain 18662 1726867314.60016: waiting for pending results... 
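The preceding entry skipped TASK: Delete veth interface lsr27 (manage_test_interface.yml:43) because its gate, type == 'veth' and state == 'absent' and interface in current_interfaces, evaluated to False; the same pattern repeats below for the dummy and tap create/delete variants, each gated on type, state, and whether interface already appears in current_interfaces. A skipped task never logs its command, so the body in the sketch below is a hypothetical placeholder; only the gating structure and the separately evaluated distribution check are taken from the log.

# Structure of the gated tasks; the real command body is not captured in this run.
- name: Delete veth interface {{ interface }}
  ansible.builtin.command: ip link delete {{ interface }}  # hypothetical placeholder command
  when:
    - ansible_distribution_major_version != '6'   # logged as a separate conditional, evaluated True
    - type == 'veth'
    - state == 'absent'
    - interface in current_interfaces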
18662 1726867314.60374: running TaskExecutor() for managed_node2/TASK: Create dummy interface lsr27 18662 1726867314.60382: in run() - task 0affcac9-a3a5-efab-a8ce-000000000138 18662 1726867314.60391: variable 'ansible_search_path' from source: unknown 18662 1726867314.60398: variable 'ansible_search_path' from source: unknown 18662 1726867314.60440: calling self._execute() 18662 1726867314.60529: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.60541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867314.60556: variable 'omit' from source: magic vars 18662 1726867314.60927: variable 'ansible_distribution_major_version' from source: facts 18662 1726867314.60944: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867314.61160: variable 'type' from source: set_fact 18662 1726867314.61170: variable 'state' from source: include params 18662 1726867314.61182: variable 'interface' from source: set_fact 18662 1726867314.61227: variable 'current_interfaces' from source: set_fact 18662 1726867314.61231: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 18662 1726867314.61234: when evaluation is False, skipping this task 18662 1726867314.61236: _execute() done 18662 1726867314.61243: dumping result to json 18662 1726867314.61245: done dumping result, returning 18662 1726867314.61247: done running TaskExecutor() for managed_node2/TASK: Create dummy interface lsr27 [0affcac9-a3a5-efab-a8ce-000000000138] 18662 1726867314.61249: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000138 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 18662 1726867314.61389: no more pending results, returning what we have 18662 1726867314.61393: results queue empty 18662 1726867314.61394: checking for any_errors_fatal 18662 1726867314.61399: done checking for any_errors_fatal 18662 1726867314.61399: checking for max_fail_percentage 18662 1726867314.61401: done checking for max_fail_percentage 18662 1726867314.61402: checking to see if all hosts have failed and the running result is not ok 18662 1726867314.61402: done checking to see if all hosts have failed 18662 1726867314.61403: getting the remaining hosts for this loop 18662 1726867314.61405: done getting the remaining hosts for this loop 18662 1726867314.61408: getting the next task for host managed_node2 18662 1726867314.61415: done getting next task for host managed_node2 18662 1726867314.61418: ^ task is: TASK: Delete dummy interface {{ interface }} 18662 1726867314.61421: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867314.61424: getting variables 18662 1726867314.61426: in VariableManager get_vars() 18662 1726867314.61456: Calling all_inventory to load vars for managed_node2 18662 1726867314.61458: Calling groups_inventory to load vars for managed_node2 18662 1726867314.61461: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867314.61474: Calling all_plugins_play to load vars for managed_node2 18662 1726867314.61478: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867314.61481: Calling groups_plugins_play to load vars for managed_node2 18662 1726867314.61889: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000138 18662 1726867314.61892: WORKER PROCESS EXITING 18662 1726867314.61916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867314.62102: done with get_vars() 18662 1726867314.62112: done getting variables 18662 1726867314.62174: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867314.62288: variable 'interface' from source: set_fact TASK [Delete dummy interface lsr27] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 17:21:54 -0400 (0:00:00.025) 0:00:09.258 ****** 18662 1726867314.62317: entering _queue_task() for managed_node2/command 18662 1726867314.62681: worker is 1 (out of 1 available) 18662 1726867314.62691: exiting _queue_task() for managed_node2/command 18662 1726867314.62700: done queuing things up, now waiting for results queue to drain 18662 1726867314.62701: waiting for pending results... 
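Each of these gates tests membership of interface in current_interfaces, a list recorded earlier via set_fact; the log only reports its source, not how it was built. Purely as a hypothetical illustration of how such a fact could be populated on a Linux host, and not taken from these test playbooks, one option is to read the entries under /sys/class/net:

# Hypothetical sketch of building a current_interfaces fact; assumes a Linux target.
- name: List entries under /sys/class/net
  ansible.builtin.find:
    paths: /sys/class/net
    file_type: any        # the entries are symlinks to the device objects
  register: net_devices

- name: Record the device names as current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ net_devices.files | map(attribute='path') | map('basename') | list }}"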
18662 1726867314.62841: running TaskExecutor() for managed_node2/TASK: Delete dummy interface lsr27 18662 1726867314.62950: in run() - task 0affcac9-a3a5-efab-a8ce-000000000139 18662 1726867314.62971: variable 'ansible_search_path' from source: unknown 18662 1726867314.62981: variable 'ansible_search_path' from source: unknown 18662 1726867314.63026: calling self._execute() 18662 1726867314.63112: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.63125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867314.63143: variable 'omit' from source: magic vars 18662 1726867314.63503: variable 'ansible_distribution_major_version' from source: facts 18662 1726867314.63520: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867314.63731: variable 'type' from source: set_fact 18662 1726867314.63741: variable 'state' from source: include params 18662 1726867314.63750: variable 'interface' from source: set_fact 18662 1726867314.63762: variable 'current_interfaces' from source: set_fact 18662 1726867314.63775: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 18662 1726867314.63785: when evaluation is False, skipping this task 18662 1726867314.63873: _execute() done 18662 1726867314.63876: dumping result to json 18662 1726867314.63881: done dumping result, returning 18662 1726867314.63883: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface lsr27 [0affcac9-a3a5-efab-a8ce-000000000139] 18662 1726867314.63885: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000139 18662 1726867314.63947: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000139 18662 1726867314.63951: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 18662 1726867314.64006: no more pending results, returning what we have 18662 1726867314.64010: results queue empty 18662 1726867314.64012: checking for any_errors_fatal 18662 1726867314.64017: done checking for any_errors_fatal 18662 1726867314.64018: checking for max_fail_percentage 18662 1726867314.64020: done checking for max_fail_percentage 18662 1726867314.64020: checking to see if all hosts have failed and the running result is not ok 18662 1726867314.64021: done checking to see if all hosts have failed 18662 1726867314.64022: getting the remaining hosts for this loop 18662 1726867314.64023: done getting the remaining hosts for this loop 18662 1726867314.64027: getting the next task for host managed_node2 18662 1726867314.64034: done getting next task for host managed_node2 18662 1726867314.64037: ^ task is: TASK: Create tap interface {{ interface }} 18662 1726867314.64041: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867314.64045: getting variables 18662 1726867314.64047: in VariableManager get_vars() 18662 1726867314.64080: Calling all_inventory to load vars for managed_node2 18662 1726867314.64083: Calling groups_inventory to load vars for managed_node2 18662 1726867314.64183: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867314.64200: Calling all_plugins_play to load vars for managed_node2 18662 1726867314.64204: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867314.64207: Calling groups_plugins_play to load vars for managed_node2 18662 1726867314.64552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867314.64751: done with get_vars() 18662 1726867314.64761: done getting variables 18662 1726867314.64818: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867314.64929: variable 'interface' from source: set_fact TASK [Create tap interface lsr27] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 17:21:54 -0400 (0:00:00.026) 0:00:09.285 ****** 18662 1726867314.64963: entering _queue_task() for managed_node2/command 18662 1726867314.65306: worker is 1 (out of 1 available) 18662 1726867314.65318: exiting _queue_task() for managed_node2/command 18662 1726867314.65326: done queuing things up, now waiting for results queue to drain 18662 1726867314.65328: waiting for pending results... 
18662 1726867314.65563: running TaskExecutor() for managed_node2/TASK: Create tap interface lsr27 18662 1726867314.65581: in run() - task 0affcac9-a3a5-efab-a8ce-00000000013a 18662 1726867314.65599: variable 'ansible_search_path' from source: unknown 18662 1726867314.65683: variable 'ansible_search_path' from source: unknown 18662 1726867314.65687: calling self._execute() 18662 1726867314.65735: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.65746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867314.65761: variable 'omit' from source: magic vars 18662 1726867314.66122: variable 'ansible_distribution_major_version' from source: facts 18662 1726867314.66142: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867314.66360: variable 'type' from source: set_fact 18662 1726867314.66375: variable 'state' from source: include params 18662 1726867314.66457: variable 'interface' from source: set_fact 18662 1726867314.66460: variable 'current_interfaces' from source: set_fact 18662 1726867314.66463: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 18662 1726867314.66466: when evaluation is False, skipping this task 18662 1726867314.66468: _execute() done 18662 1726867314.66470: dumping result to json 18662 1726867314.66478: done dumping result, returning 18662 1726867314.66481: done running TaskExecutor() for managed_node2/TASK: Create tap interface lsr27 [0affcac9-a3a5-efab-a8ce-00000000013a] 18662 1726867314.66483: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000013a 18662 1726867314.66544: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000013a 18662 1726867314.66547: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 18662 1726867314.66611: no more pending results, returning what we have 18662 1726867314.66614: results queue empty 18662 1726867314.66615: checking for any_errors_fatal 18662 1726867314.66621: done checking for any_errors_fatal 18662 1726867314.66622: checking for max_fail_percentage 18662 1726867314.66624: done checking for max_fail_percentage 18662 1726867314.66624: checking to see if all hosts have failed and the running result is not ok 18662 1726867314.66625: done checking to see if all hosts have failed 18662 1726867314.66626: getting the remaining hosts for this loop 18662 1726867314.66627: done getting the remaining hosts for this loop 18662 1726867314.66630: getting the next task for host managed_node2 18662 1726867314.66637: done getting next task for host managed_node2 18662 1726867314.66639: ^ task is: TASK: Delete tap interface {{ interface }} 18662 1726867314.66643: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867314.66647: getting variables 18662 1726867314.66648: in VariableManager get_vars() 18662 1726867314.66675: Calling all_inventory to load vars for managed_node2 18662 1726867314.66680: Calling groups_inventory to load vars for managed_node2 18662 1726867314.66684: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867314.66792: Calling all_plugins_play to load vars for managed_node2 18662 1726867314.66883: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867314.66887: Calling groups_plugins_play to load vars for managed_node2 18662 1726867314.67063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867314.67266: done with get_vars() 18662 1726867314.67276: done getting variables 18662 1726867314.67331: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867314.67439: variable 'interface' from source: set_fact TASK [Delete tap interface lsr27] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 17:21:54 -0400 (0:00:00.025) 0:00:09.310 ****** 18662 1726867314.67471: entering _queue_task() for managed_node2/command 18662 1726867314.67796: worker is 1 (out of 1 available) 18662 1726867314.67805: exiting _queue_task() for managed_node2/command 18662 1726867314.67815: done queuing things up, now waiting for results queue to drain 18662 1726867314.67816: waiting for pending results... 
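Across this whole block the source annotations are consistent: type, interface, and current_interfaces come "from source: set_fact", while state comes "from source: include params", i.e. it is supplied by whatever includes manage_test_interface.yml rather than set inside it. The values in the sketch below are therefore only inferred (lsr27 is literal in the task banners; veth is inferred from the "Set up veth as managed by NetworkManager" task) and are shown solely to make the gate variables concrete.

# Hypothetical illustration of the variable sources named in the log; not the real call site.
- name: Facts the gates rely on ("from source: set_fact")
  ansible.builtin.set_fact:
    interface: lsr27   # literal in the task banners above
    type: veth         # inferred from the "Set up veth ..." task name
# state ("from source: include params") is passed in by the caller of
# manage_test_interface.yml and is not set as a fact here.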
18662 1726867314.68004: running TaskExecutor() for managed_node2/TASK: Delete tap interface lsr27 18662 1726867314.68047: in run() - task 0affcac9-a3a5-efab-a8ce-00000000013b 18662 1726867314.68066: variable 'ansible_search_path' from source: unknown 18662 1726867314.68103: variable 'ansible_search_path' from source: unknown 18662 1726867314.68118: calling self._execute() 18662 1726867314.68199: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.68217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867314.68231: variable 'omit' from source: magic vars 18662 1726867314.68584: variable 'ansible_distribution_major_version' from source: facts 18662 1726867314.68647: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867314.68811: variable 'type' from source: set_fact 18662 1726867314.68822: variable 'state' from source: include params 18662 1726867314.68830: variable 'interface' from source: set_fact 18662 1726867314.68838: variable 'current_interfaces' from source: set_fact 18662 1726867314.68849: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 18662 1726867314.68856: when evaluation is False, skipping this task 18662 1726867314.68868: _execute() done 18662 1726867314.68875: dumping result to json 18662 1726867314.68884: done dumping result, returning 18662 1726867314.68907: done running TaskExecutor() for managed_node2/TASK: Delete tap interface lsr27 [0affcac9-a3a5-efab-a8ce-00000000013b] 18662 1726867314.68910: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000013b 18662 1726867314.69037: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000013b 18662 1726867314.69040: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 18662 1726867314.69190: no more pending results, returning what we have 18662 1726867314.69193: results queue empty 18662 1726867314.69194: checking for any_errors_fatal 18662 1726867314.69199: done checking for any_errors_fatal 18662 1726867314.69200: checking for max_fail_percentage 18662 1726867314.69201: done checking for max_fail_percentage 18662 1726867314.69202: checking to see if all hosts have failed and the running result is not ok 18662 1726867314.69203: done checking to see if all hosts have failed 18662 1726867314.69203: getting the remaining hosts for this loop 18662 1726867314.69204: done getting the remaining hosts for this loop 18662 1726867314.69207: getting the next task for host managed_node2 18662 1726867314.69214: done getting next task for host managed_node2 18662 1726867314.69216: ^ task is: TASK: Include the task 'assert_device_present.yml' 18662 1726867314.69218: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867314.69222: getting variables 18662 1726867314.69223: in VariableManager get_vars() 18662 1726867314.69245: Calling all_inventory to load vars for managed_node2 18662 1726867314.69247: Calling groups_inventory to load vars for managed_node2 18662 1726867314.69250: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867314.69259: Calling all_plugins_play to load vars for managed_node2 18662 1726867314.69261: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867314.69264: Calling groups_plugins_play to load vars for managed_node2 18662 1726867314.69521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867314.69713: done with get_vars() 18662 1726867314.69722: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30 Friday 20 September 2024 17:21:54 -0400 (0:00:00.023) 0:00:09.333 ****** 18662 1726867314.69802: entering _queue_task() for managed_node2/include_tasks 18662 1726867314.70009: worker is 1 (out of 1 available) 18662 1726867314.70021: exiting _queue_task() for managed_node2/include_tasks 18662 1726867314.70144: done queuing things up, now waiting for results queue to drain 18662 1726867314.70146: waiting for pending results... 18662 1726867314.70375: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' 18662 1726867314.70383: in run() - task 0affcac9-a3a5-efab-a8ce-000000000012 18662 1726867314.70386: variable 'ansible_search_path' from source: unknown 18662 1726867314.70409: calling self._execute() 18662 1726867314.70490: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.70502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867314.70517: variable 'omit' from source: magic vars 18662 1726867314.70856: variable 'ansible_distribution_major_version' from source: facts 18662 1726867314.70873: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867314.70887: _execute() done 18662 1726867314.70895: dumping result to json 18662 1726867314.70907: done dumping result, returning 18662 1726867314.71019: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' [0affcac9-a3a5-efab-a8ce-000000000012] 18662 1726867314.71022: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000012 18662 1726867314.71087: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000012 18662 1726867314.71090: WORKER PROCESS EXITING 18662 1726867314.71118: no more pending results, returning what we have 18662 1726867314.71125: in VariableManager get_vars() 18662 1726867314.71156: Calling all_inventory to load vars for managed_node2 18662 1726867314.71159: Calling groups_inventory to load vars for managed_node2 18662 1726867314.71163: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867314.71176: Calling all_plugins_play to load vars for managed_node2 18662 1726867314.71181: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867314.71184: Calling groups_plugins_play to load vars for managed_node2 18662 1726867314.71464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867314.71671: done with get_vars() 18662 
1726867314.71681: variable 'ansible_search_path' from source: unknown 18662 1726867314.71693: we have included files to process 18662 1726867314.71694: generating all_blocks data 18662 1726867314.71695: done generating all_blocks data 18662 1726867314.71699: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 18662 1726867314.71700: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 18662 1726867314.71703: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 18662 1726867314.71857: in VariableManager get_vars() 18662 1726867314.71871: done with get_vars() 18662 1726867314.72003: done processing included file 18662 1726867314.72004: iterating over new_blocks loaded from include file 18662 1726867314.72005: in VariableManager get_vars() 18662 1726867314.72013: done with get_vars() 18662 1726867314.72014: filtering new block on tags 18662 1726867314.72024: done filtering new block on tags 18662 1726867314.72025: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 18662 1726867314.72029: extending task lists for all hosts with included blocks 18662 1726867314.72380: done extending task lists 18662 1726867314.72381: done processing included files 18662 1726867314.72382: results queue empty 18662 1726867314.72382: checking for any_errors_fatal 18662 1726867314.72383: done checking for any_errors_fatal 18662 1726867314.72384: checking for max_fail_percentage 18662 1726867314.72384: done checking for max_fail_percentage 18662 1726867314.72385: checking to see if all hosts have failed and the running result is not ok 18662 1726867314.72385: done checking to see if all hosts have failed 18662 1726867314.72386: getting the remaining hosts for this loop 18662 1726867314.72386: done getting the remaining hosts for this loop 18662 1726867314.72388: getting the next task for host managed_node2 18662 1726867314.72390: done getting next task for host managed_node2 18662 1726867314.72391: ^ task is: TASK: Include the task 'get_interface_stat.yml' 18662 1726867314.72393: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867314.72394: getting variables 18662 1726867314.72395: in VariableManager get_vars() 18662 1726867314.72400: Calling all_inventory to load vars for managed_node2 18662 1726867314.72401: Calling groups_inventory to load vars for managed_node2 18662 1726867314.72402: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867314.72405: Calling all_plugins_play to load vars for managed_node2 18662 1726867314.72407: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867314.72409: Calling groups_plugins_play to load vars for managed_node2 18662 1726867314.72489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867314.72593: done with get_vars() 18662 1726867314.72600: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:21:54 -0400 (0:00:00.028) 0:00:09.361 ****** 18662 1726867314.72644: entering _queue_task() for managed_node2/include_tasks 18662 1726867314.72803: worker is 1 (out of 1 available) 18662 1726867314.72814: exiting _queue_task() for managed_node2/include_tasks 18662 1726867314.72825: done queuing things up, now waiting for results queue to drain 18662 1726867314.72827: waiting for pending results... 18662 1726867314.72973: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 18662 1726867314.73033: in run() - task 0affcac9-a3a5-efab-a8ce-0000000001d3 18662 1726867314.73043: variable 'ansible_search_path' from source: unknown 18662 1726867314.73049: variable 'ansible_search_path' from source: unknown 18662 1726867314.73076: calling self._execute() 18662 1726867314.73142: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.73146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867314.73154: variable 'omit' from source: magic vars 18662 1726867314.73403: variable 'ansible_distribution_major_version' from source: facts 18662 1726867314.73415: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867314.73419: _execute() done 18662 1726867314.73423: dumping result to json 18662 1726867314.73426: done dumping result, returning 18662 1726867314.73432: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-efab-a8ce-0000000001d3] 18662 1726867314.73437: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000001d3 18662 1726867314.73512: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000001d3 18662 1726867314.73515: WORKER PROCESS EXITING 18662 1726867314.73543: no more pending results, returning what we have 18662 1726867314.73548: in VariableManager get_vars() 18662 1726867314.73576: Calling all_inventory to load vars for managed_node2 18662 1726867314.73581: Calling groups_inventory to load vars for managed_node2 18662 1726867314.73583: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867314.73593: Calling all_plugins_play to load vars for managed_node2 18662 1726867314.73595: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867314.73598: Calling groups_plugins_play to load vars for managed_node2 18662 1726867314.73760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 18662 1726867314.73964: done with get_vars() 18662 1726867314.73971: variable 'ansible_search_path' from source: unknown 18662 1726867314.73972: variable 'ansible_search_path' from source: unknown 18662 1726867314.74007: we have included files to process 18662 1726867314.74008: generating all_blocks data 18662 1726867314.74010: done generating all_blocks data 18662 1726867314.74011: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18662 1726867314.74013: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18662 1726867314.74015: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18662 1726867314.74215: done processing included file 18662 1726867314.74217: iterating over new_blocks loaded from include file 18662 1726867314.74218: in VariableManager get_vars() 18662 1726867314.74229: done with get_vars() 18662 1726867314.74231: filtering new block on tags 18662 1726867314.74243: done filtering new block on tags 18662 1726867314.74245: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 18662 1726867314.74249: extending task lists for all hosts with included blocks 18662 1726867314.74339: done extending task lists 18662 1726867314.74340: done processing included files 18662 1726867314.74341: results queue empty 18662 1726867314.74342: checking for any_errors_fatal 18662 1726867314.74344: done checking for any_errors_fatal 18662 1726867314.74345: checking for max_fail_percentage 18662 1726867314.74346: done checking for max_fail_percentage 18662 1726867314.74347: checking to see if all hosts have failed and the running result is not ok 18662 1726867314.74347: done checking to see if all hosts have failed 18662 1726867314.74348: getting the remaining hosts for this loop 18662 1726867314.74349: done getting the remaining hosts for this loop 18662 1726867314.74351: getting the next task for host managed_node2 18662 1726867314.74355: done getting next task for host managed_node2 18662 1726867314.74357: ^ task is: TASK: Get stat for interface {{ interface }} 18662 1726867314.74359: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867314.74361: getting variables 18662 1726867314.74362: in VariableManager get_vars() 18662 1726867314.74369: Calling all_inventory to load vars for managed_node2 18662 1726867314.74371: Calling groups_inventory to load vars for managed_node2 18662 1726867314.74373: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867314.74380: Calling all_plugins_play to load vars for managed_node2 18662 1726867314.74382: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867314.74385: Calling groups_plugins_play to load vars for managed_node2 18662 1726867314.74515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867314.74694: done with get_vars() 18662 1726867314.74702: done getting variables 18662 1726867314.74822: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:21:54 -0400 (0:00:00.021) 0:00:09.383 ****** 18662 1726867314.74842: entering _queue_task() for managed_node2/stat 18662 1726867314.75016: worker is 1 (out of 1 available) 18662 1726867314.75028: exiting _queue_task() for managed_node2/stat 18662 1726867314.75038: done queuing things up, now waiting for results queue to drain 18662 1726867314.75039: waiting for pending results... 18662 1726867314.75179: running TaskExecutor() for managed_node2/TASK: Get stat for interface lsr27 18662 1726867314.75242: in run() - task 0affcac9-a3a5-efab-a8ce-00000000021e 18662 1726867314.75252: variable 'ansible_search_path' from source: unknown 18662 1726867314.75258: variable 'ansible_search_path' from source: unknown 18662 1726867314.75284: calling self._execute() 18662 1726867314.75339: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.75342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867314.75351: variable 'omit' from source: magic vars 18662 1726867314.75642: variable 'ansible_distribution_major_version' from source: facts 18662 1726867314.75650: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867314.75657: variable 'omit' from source: magic vars 18662 1726867314.75688: variable 'omit' from source: magic vars 18662 1726867314.75756: variable 'interface' from source: set_fact 18662 1726867314.75769: variable 'omit' from source: magic vars 18662 1726867314.75800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867314.75829: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867314.75848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867314.75860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867314.75869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867314.75892: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867314.75896: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.75899: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 18662 1726867314.75965: Set connection var ansible_timeout to 10 18662 1726867314.75968: Set connection var ansible_connection to ssh 18662 1726867314.75973: Set connection var ansible_shell_executable to /bin/sh 18662 1726867314.75975: Set connection var ansible_shell_type to sh 18662 1726867314.75985: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867314.75989: Set connection var ansible_pipelining to False 18662 1726867314.76007: variable 'ansible_shell_executable' from source: unknown 18662 1726867314.76010: variable 'ansible_connection' from source: unknown 18662 1726867314.76015: variable 'ansible_module_compression' from source: unknown 18662 1726867314.76017: variable 'ansible_shell_type' from source: unknown 18662 1726867314.76022: variable 'ansible_shell_executable' from source: unknown 18662 1726867314.76024: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867314.76026: variable 'ansible_pipelining' from source: unknown 18662 1726867314.76029: variable 'ansible_timeout' from source: unknown 18662 1726867314.76034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867314.76172: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867314.76180: variable 'omit' from source: magic vars 18662 1726867314.76186: starting attempt loop 18662 1726867314.76189: running the handler 18662 1726867314.76200: _low_level_execute_command(): starting 18662 1726867314.76207: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867314.76701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.76706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.76711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.76714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.76807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.76864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.78563: stdout chunk (state=3): >>>/root <<< 18662 1726867314.78663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867314.78686: stderr chunk (state=3): >>><<< 18662 1726867314.78690: stdout chunk (state=3): >>><<< 18662 1726867314.78708: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867314.78722: _low_level_execute_command(): starting 18662 1726867314.78728: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185 `" && echo ansible-tmp-1726867314.7871118-19102-185443717451185="` echo /root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185 `" ) && sleep 0' 18662 1726867314.79129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.79132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.79135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.79142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.79187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867314.79191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.79250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.81147: stdout chunk (state=3): >>>ansible-tmp-1726867314.7871118-19102-185443717451185=/root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185 <<< 18662 1726867314.81259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867314.81280: stderr chunk (state=3): >>><<< 18662 1726867314.81284: stdout 
chunk (state=3): >>><<< 18662 1726867314.81296: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867314.7871118-19102-185443717451185=/root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867314.81332: variable 'ansible_module_compression' from source: unknown 18662 1726867314.81373: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18662 1726867314.81402: variable 'ansible_facts' from source: unknown 18662 1726867314.81462: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185/AnsiballZ_stat.py 18662 1726867314.81549: Sending initial data 18662 1726867314.81553: Sent initial data (153 bytes) 18662 1726867314.81956: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.81960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867314.81962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18662 1726867314.81964: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867314.81966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.82022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867314.82025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.82060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.83642: stderr chunk 
(state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867314.83678: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867314.83737: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp7nf3s2rn /root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185/AnsiballZ_stat.py <<< 18662 1726867314.83740: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185/AnsiballZ_stat.py" <<< 18662 1726867314.83806: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp7nf3s2rn" to remote "/root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185/AnsiballZ_stat.py" <<< 18662 1726867314.84622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867314.84625: stderr chunk (state=3): >>><<< 18662 1726867314.84627: stdout chunk (state=3): >>><<< 18662 1726867314.84629: done transferring module to remote 18662 1726867314.84631: _low_level_execute_command(): starting 18662 1726867314.84634: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185/ /root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185/AnsiballZ_stat.py && sleep 0' 18662 1726867314.85124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867314.85140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867314.85155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867314.85172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867314.85218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.85288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867314.85305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867314.85331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.85401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867314.87202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867314.87222: stderr chunk (state=3): >>><<< 18662 1726867314.87225: stdout chunk (state=3): >>><<< 18662 1726867314.87239: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867314.87242: _low_level_execute_command(): starting 18662 1726867314.87244: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185/AnsiballZ_stat.py && sleep 0' 18662 1726867314.87823: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867314.87827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867314.87829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867314.87870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867315.03568: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": 
"0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28148, "dev": 23, "nlink": 1, "atime": 1726867313.4380338, "mtime": 1726867313.4380338, "ctime": 1726867313.4380338, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18662 1726867315.04914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867315.04939: stderr chunk (state=3): >>><<< 18662 1726867315.04942: stdout chunk (state=3): >>><<< 18662 1726867315.04956: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28148, "dev": 23, "nlink": 1, "atime": 1726867313.4380338, "mtime": 1726867313.4380338, "ctime": 1726867313.4380338, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
18662 1726867315.04994: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867315.05002: _low_level_execute_command(): starting 18662 1726867315.05006: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867314.7871118-19102-185443717451185/ > /dev/null 2>&1 && sleep 0' 18662 1726867315.05414: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867315.05422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867315.05424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867315.05426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867315.05429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867315.05483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867315.05486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867315.05521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867315.07333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867315.07352: stderr chunk (state=3): >>><<< 18662 1726867315.07355: stdout chunk (state=3): >>><<< 18662 1726867315.07366: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867315.07371: handler run complete 18662 1726867315.07408: attempt loop complete, returning result 18662 1726867315.07411: _execute() done 18662 1726867315.07416: dumping result to json 18662 1726867315.07422: done dumping result, returning 18662 1726867315.07429: done running TaskExecutor() for managed_node2/TASK: Get stat for interface lsr27 [0affcac9-a3a5-efab-a8ce-00000000021e] 18662 1726867315.07436: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000021e 18662 1726867315.07535: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000021e 18662 1726867315.07537: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726867313.4380338, "block_size": 4096, "blocks": 0, "ctime": 1726867313.4380338, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28148, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "mode": "0777", "mtime": 1726867313.4380338, "nlink": 1, "path": "/sys/class/net/lsr27", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 18662 1726867315.07618: no more pending results, returning what we have 18662 1726867315.07621: results queue empty 18662 1726867315.07622: checking for any_errors_fatal 18662 1726867315.07624: done checking for any_errors_fatal 18662 1726867315.07624: checking for max_fail_percentage 18662 1726867315.07626: done checking for max_fail_percentage 18662 1726867315.07627: checking to see if all hosts have failed and the running result is not ok 18662 1726867315.07627: done checking to see if all hosts have failed 18662 1726867315.07628: getting the remaining hosts for this loop 18662 1726867315.07629: done getting the remaining hosts for this loop 18662 1726867315.07633: getting the next task for host managed_node2 18662 1726867315.07640: done getting next task for host managed_node2 18662 1726867315.07642: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 18662 1726867315.07645: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867315.07649: getting variables 18662 1726867315.07650: in VariableManager get_vars() 18662 1726867315.07733: Calling all_inventory to load vars for managed_node2 18662 1726867315.07735: Calling groups_inventory to load vars for managed_node2 18662 1726867315.07738: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867315.07748: Calling all_plugins_play to load vars for managed_node2 18662 1726867315.07750: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867315.07753: Calling groups_plugins_play to load vars for managed_node2 18662 1726867315.07862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867315.07986: done with get_vars() 18662 1726867315.07993: done getting variables 18662 1726867315.08064: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 18662 1726867315.08151: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'lsr27'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:21:55 -0400 (0:00:00.333) 0:00:09.717 ****** 18662 1726867315.08172: entering _queue_task() for managed_node2/assert 18662 1726867315.08173: Creating lock for assert 18662 1726867315.08359: worker is 1 (out of 1 available) 18662 1726867315.08372: exiting _queue_task() for managed_node2/assert 18662 1726867315.08386: done queuing things up, now waiting for results queue to drain 18662 1726867315.08387: waiting for pending results... 
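The assert action that runs next executes entirely on the controller: it evaluates interface_stat.stat.exists against the stat result gathered above and reports "All assertions passed" without another SSH round trip. A minimal sketch of such an assertion follows, assuming the task at assert_device_present.yml:5 looks roughly like this (the file itself is not shown in the log; fail_msg is an assumed addition).

    # Hedged sketch -- the conditional matches the string the log reports as
    # "Evaluated conditional (interface_stat.stat.exists): True".
    - name: "Assert that the interface is present - '{{ interface }}'"
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists
        fail_msg: "Interface {{ interface }} is not present"   # assumed, not from the log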
18662 1726867315.08538: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'lsr27' 18662 1726867315.08601: in run() - task 0affcac9-a3a5-efab-a8ce-0000000001d4 18662 1726867315.08619: variable 'ansible_search_path' from source: unknown 18662 1726867315.08623: variable 'ansible_search_path' from source: unknown 18662 1726867315.08647: calling self._execute() 18662 1726867315.08704: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867315.08709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867315.08724: variable 'omit' from source: magic vars 18662 1726867315.08971: variable 'ansible_distribution_major_version' from source: facts 18662 1726867315.08981: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867315.08987: variable 'omit' from source: magic vars 18662 1726867315.09015: variable 'omit' from source: magic vars 18662 1726867315.09085: variable 'interface' from source: set_fact 18662 1726867315.09099: variable 'omit' from source: magic vars 18662 1726867315.09132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867315.09159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867315.09178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867315.09191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867315.09199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867315.09224: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867315.09227: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867315.09229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867315.09300: Set connection var ansible_timeout to 10 18662 1726867315.09303: Set connection var ansible_connection to ssh 18662 1726867315.09308: Set connection var ansible_shell_executable to /bin/sh 18662 1726867315.09313: Set connection var ansible_shell_type to sh 18662 1726867315.09322: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867315.09327: Set connection var ansible_pipelining to False 18662 1726867315.09345: variable 'ansible_shell_executable' from source: unknown 18662 1726867315.09348: variable 'ansible_connection' from source: unknown 18662 1726867315.09351: variable 'ansible_module_compression' from source: unknown 18662 1726867315.09353: variable 'ansible_shell_type' from source: unknown 18662 1726867315.09355: variable 'ansible_shell_executable' from source: unknown 18662 1726867315.09357: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867315.09362: variable 'ansible_pipelining' from source: unknown 18662 1726867315.09364: variable 'ansible_timeout' from source: unknown 18662 1726867315.09369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867315.09468: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 18662 1726867315.09475: variable 'omit' from source: magic vars 18662 1726867315.09485: starting attempt loop 18662 1726867315.09488: running the handler 18662 1726867315.09571: variable 'interface_stat' from source: set_fact 18662 1726867315.09588: Evaluated conditional (interface_stat.stat.exists): True 18662 1726867315.09591: handler run complete 18662 1726867315.09605: attempt loop complete, returning result 18662 1726867315.09608: _execute() done 18662 1726867315.09610: dumping result to json 18662 1726867315.09613: done dumping result, returning 18662 1726867315.09620: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'lsr27' [0affcac9-a3a5-efab-a8ce-0000000001d4] 18662 1726867315.09625: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000001d4 18662 1726867315.09702: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000001d4 18662 1726867315.09705: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 18662 1726867315.09754: no more pending results, returning what we have 18662 1726867315.09757: results queue empty 18662 1726867315.09758: checking for any_errors_fatal 18662 1726867315.09764: done checking for any_errors_fatal 18662 1726867315.09764: checking for max_fail_percentage 18662 1726867315.09766: done checking for max_fail_percentage 18662 1726867315.09766: checking to see if all hosts have failed and the running result is not ok 18662 1726867315.09767: done checking to see if all hosts have failed 18662 1726867315.09768: getting the remaining hosts for this loop 18662 1726867315.09769: done getting the remaining hosts for this loop 18662 1726867315.09772: getting the next task for host managed_node2 18662 1726867315.09781: done getting next task for host managed_node2 18662 1726867315.09783: ^ task is: TASK: meta (flush_handlers) 18662 1726867315.09785: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867315.09788: getting variables 18662 1726867315.09789: in VariableManager get_vars() 18662 1726867315.09811: Calling all_inventory to load vars for managed_node2 18662 1726867315.09813: Calling groups_inventory to load vars for managed_node2 18662 1726867315.09817: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867315.09826: Calling all_plugins_play to load vars for managed_node2 18662 1726867315.09829: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867315.09831: Calling groups_plugins_play to load vars for managed_node2 18662 1726867315.09943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867315.10084: done with get_vars() 18662 1726867315.10093: done getting variables 18662 1726867315.10156: in VariableManager get_vars() 18662 1726867315.10162: Calling all_inventory to load vars for managed_node2 18662 1726867315.10164: Calling groups_inventory to load vars for managed_node2 18662 1726867315.10165: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867315.10168: Calling all_plugins_play to load vars for managed_node2 18662 1726867315.10169: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867315.10171: Calling groups_plugins_play to load vars for managed_node2 18662 1726867315.10255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867315.10372: done with get_vars() 18662 1726867315.10383: done queuing things up, now waiting for results queue to drain 18662 1726867315.10384: results queue empty 18662 1726867315.10385: checking for any_errors_fatal 18662 1726867315.10386: done checking for any_errors_fatal 18662 1726867315.10386: checking for max_fail_percentage 18662 1726867315.10387: done checking for max_fail_percentage 18662 1726867315.10387: checking to see if all hosts have failed and the running result is not ok 18662 1726867315.10388: done checking to see if all hosts have failed 18662 1726867315.10392: getting the remaining hosts for this loop 18662 1726867315.10393: done getting the remaining hosts for this loop 18662 1726867315.10394: getting the next task for host managed_node2 18662 1726867315.10396: done getting next task for host managed_node2 18662 1726867315.10397: ^ task is: TASK: meta (flush_handlers) 18662 1726867315.10398: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867315.10400: getting variables 18662 1726867315.10401: in VariableManager get_vars() 18662 1726867315.10407: Calling all_inventory to load vars for managed_node2 18662 1726867315.10409: Calling groups_inventory to load vars for managed_node2 18662 1726867315.10411: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867315.10414: Calling all_plugins_play to load vars for managed_node2 18662 1726867315.10415: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867315.10417: Calling groups_plugins_play to load vars for managed_node2 18662 1726867315.10495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867315.10614: done with get_vars() 18662 1726867315.10620: done getting variables 18662 1726867315.10649: in VariableManager get_vars() 18662 1726867315.10655: Calling all_inventory to load vars for managed_node2 18662 1726867315.10656: Calling groups_inventory to load vars for managed_node2 18662 1726867315.10657: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867315.10660: Calling all_plugins_play to load vars for managed_node2 18662 1726867315.10661: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867315.10663: Calling groups_plugins_play to load vars for managed_node2 18662 1726867315.10741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867315.10841: done with get_vars() 18662 1726867315.10850: done queuing things up, now waiting for results queue to drain 18662 1726867315.10851: results queue empty 18662 1726867315.10852: checking for any_errors_fatal 18662 1726867315.10852: done checking for any_errors_fatal 18662 1726867315.10853: checking for max_fail_percentage 18662 1726867315.10854: done checking for max_fail_percentage 18662 1726867315.10854: checking to see if all hosts have failed and the running result is not ok 18662 1726867315.10855: done checking to see if all hosts have failed 18662 1726867315.10855: getting the remaining hosts for this loop 18662 1726867315.10856: done getting the remaining hosts for this loop 18662 1726867315.10857: getting the next task for host managed_node2 18662 1726867315.10859: done getting next task for host managed_node2 18662 1726867315.10859: ^ task is: None 18662 1726867315.10860: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867315.10861: done queuing things up, now waiting for results queue to drain 18662 1726867315.10861: results queue empty 18662 1726867315.10862: checking for any_errors_fatal 18662 1726867315.10862: done checking for any_errors_fatal 18662 1726867315.10863: checking for max_fail_percentage 18662 1726867315.10863: done checking for max_fail_percentage 18662 1726867315.10863: checking to see if all hosts have failed and the running result is not ok 18662 1726867315.10864: done checking to see if all hosts have failed 18662 1726867315.10865: getting the next task for host managed_node2 18662 1726867315.10866: done getting next task for host managed_node2 18662 1726867315.10866: ^ task is: None 18662 1726867315.10867: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867315.10902: in VariableManager get_vars() 18662 1726867315.10917: done with get_vars() 18662 1726867315.10920: in VariableManager get_vars() 18662 1726867315.10928: done with get_vars() 18662 1726867315.10932: variable 'omit' from source: magic vars 18662 1726867315.10952: in VariableManager get_vars() 18662 1726867315.10961: done with get_vars() 18662 1726867315.10975: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 18662 1726867315.11325: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18662 1726867315.11344: getting the remaining hosts for this loop 18662 1726867315.11345: done getting the remaining hosts for this loop 18662 1726867315.11346: getting the next task for host managed_node2 18662 1726867315.11348: done getting next task for host managed_node2 18662 1726867315.11349: ^ task is: TASK: Gathering Facts 18662 1726867315.11350: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867315.11351: getting variables 18662 1726867315.11352: in VariableManager get_vars() 18662 1726867315.11391: Calling all_inventory to load vars for managed_node2 18662 1726867315.11392: Calling groups_inventory to load vars for managed_node2 18662 1726867315.11394: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867315.11397: Calling all_plugins_play to load vars for managed_node2 18662 1726867315.11398: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867315.11400: Calling groups_plugins_play to load vars for managed_node2 18662 1726867315.11474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867315.11579: done with get_vars() 18662 1726867315.11585: done getting variables 18662 1726867315.11613: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Friday 20 September 2024 17:21:55 -0400 (0:00:00.034) 0:00:09.751 ****** 18662 1726867315.11629: entering _queue_task() for managed_node2/gather_facts 18662 1726867315.11800: worker is 1 (out of 1 available) 18662 1726867315.11812: exiting _queue_task() for managed_node2/gather_facts 18662 1726867315.11822: done queuing things up, now waiting for results queue to drain 18662 1726867315.11824: waiting for pending results... 
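The new play "Test static interface up" opens with an implicit Gathering Facts task, which queues the gather_facts action and ships the setup module (AnsiballZ_setup.py) to the remote host in the steps that follow. A rough sketch of a play header that would trigger this is shown below; only the play name and the task path (tests_ethernet.yml:33) appear in the log, so the hosts pattern and other keywords are assumptions.

    # Hedged sketch of the play opening; everything except the name is assumed.
    - name: Test static interface up
      hosts: all
      gather_facts: true
      # ... test tasks follow in the original playbook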
18662 1726867315.11971: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18662 1726867315.12028: in run() - task 0affcac9-a3a5-efab-a8ce-000000000237 18662 1726867315.12039: variable 'ansible_search_path' from source: unknown 18662 1726867315.12069: calling self._execute() 18662 1726867315.12141: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867315.12382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867315.12386: variable 'omit' from source: magic vars 18662 1726867315.12532: variable 'ansible_distribution_major_version' from source: facts 18662 1726867315.12550: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867315.12560: variable 'omit' from source: magic vars 18662 1726867315.12590: variable 'omit' from source: magic vars 18662 1726867315.12629: variable 'omit' from source: magic vars 18662 1726867315.12670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867315.12713: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867315.12737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867315.12760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867315.12775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867315.12807: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867315.12818: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867315.12826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867315.12927: Set connection var ansible_timeout to 10 18662 1726867315.12936: Set connection var ansible_connection to ssh 18662 1726867315.12962: Set connection var ansible_shell_executable to /bin/sh 18662 1726867315.12970: Set connection var ansible_shell_type to sh 18662 1726867315.12987: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867315.13023: Set connection var ansible_pipelining to False 18662 1726867315.13051: variable 'ansible_shell_executable' from source: unknown 18662 1726867315.13059: variable 'ansible_connection' from source: unknown 18662 1726867315.13066: variable 'ansible_module_compression' from source: unknown 18662 1726867315.13073: variable 'ansible_shell_type' from source: unknown 18662 1726867315.13083: variable 'ansible_shell_executable' from source: unknown 18662 1726867315.13100: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867315.13118: variable 'ansible_pipelining' from source: unknown 18662 1726867315.13133: variable 'ansible_timeout' from source: unknown 18662 1726867315.13144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867315.13295: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867315.13303: variable 'omit' from source: magic vars 18662 1726867315.13307: starting attempt loop 18662 1726867315.13327: running the 
handler 18662 1726867315.13337: variable 'ansible_facts' from source: unknown 18662 1726867315.13352: _low_level_execute_command(): starting 18662 1726867315.13359: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867315.13836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867315.13840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867315.13843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867315.13845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867315.13895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867315.13899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867315.13950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867315.15621: stdout chunk (state=3): >>>/root <<< 18662 1726867315.15729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867315.15771: stderr chunk (state=3): >>><<< 18662 1726867315.15775: stdout chunk (state=3): >>><<< 18662 1726867315.15831: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867315.15834: _low_level_execute_command(): starting 18662 1726867315.15837: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809 `" && echo ansible-tmp-1726867315.1579564-19124-205240572084809="` echo /root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809 `" ) && sleep 0' 18662 1726867315.16394: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867315.16407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867315.16494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867315.16532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867315.16546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867315.16564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867315.16636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867315.18590: stdout chunk (state=3): >>>ansible-tmp-1726867315.1579564-19124-205240572084809=/root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809 <<< 18662 1726867315.18748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867315.18751: stdout chunk (state=3): >>><<< 18662 1726867315.18754: stderr chunk (state=3): >>><<< 18662 1726867315.18883: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867315.1579564-19124-205240572084809=/root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867315.18886: variable 
'ansible_module_compression' from source: unknown 18662 1726867315.18889: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18662 1726867315.18927: variable 'ansible_facts' from source: unknown 18662 1726867315.19152: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809/AnsiballZ_setup.py 18662 1726867315.19362: Sending initial data 18662 1726867315.19365: Sent initial data (154 bytes) 18662 1726867315.19974: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867315.19999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867315.20015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867315.20089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867315.21720: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867315.21756: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867315.21796: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpc0f7z8_i /root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809/AnsiballZ_setup.py <<< 18662 1726867315.21814: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809/AnsiballZ_setup.py" <<< 18662 1726867315.21851: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpc0f7z8_i" to remote "/root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809/AnsiballZ_setup.py" <<< 18662 1726867315.23526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867315.23530: stdout chunk (state=3): >>><<< 18662 1726867315.23533: stderr chunk (state=3): >>><<< 18662 1726867315.23535: done transferring module to remote 18662 1726867315.23537: _low_level_execute_command(): starting 18662 1726867315.23540: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809/ /root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809/AnsiballZ_setup.py && sleep 0' 18662 1726867315.24194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867315.24198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867315.24271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867315.24301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867315.24339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867315.24410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867315.26310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867315.26333: stderr chunk (state=3): >>><<< 18662 1726867315.26341: stdout chunk (state=3): >>><<< 18662 1726867315.26367: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867315.26415: _low_level_execute_command(): starting 18662 1726867315.26419: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809/AnsiballZ_setup.py && sleep 0' 18662 1726867315.27027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867315.27041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867315.27092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867315.27170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867315.27214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867315.27267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867315.92863: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2955, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 576, "free": 2955}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", 
"scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 553, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794787328, "block_size": 4096, "block_total": 65519099, "block_available": 63914743, "block_used": 1604356, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "55", "epoch": "1726867315", "epoch_int": "1726867315", "date": "2024-09-20", "time": "17:21:55", "iso8601_micro": "2024-09-20T21:21:55.863958Z", "iso8601": "2024-09-20T21:21:55Z", "iso8601_basic": "20240920T172155863958", "iso8601_basic_short": "20240920T172155", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_interfaces": ["lsr27", "lo", "eth0", "peerlsr27"], "ansible_lsr27": {"device": "lsr27", "macaddress": "32:21:19:c5:50:d2", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3021:19ff:fec5:50d2", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "e2:02:64:1b:da:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e002:64ff:fe1b:da9a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", 
"scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off 
[fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a", "fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::3021:19ff:fec5:50d2"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.380859375, "5m": 0.37744140625, "15m": 0.19921875}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18662 1726867315.94860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867315.94888: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. <<< 18662 1726867315.94937: stderr chunk (state=3): >>><<< 18662 1726867315.94947: stdout chunk (state=3): >>><<< 18662 1726867315.95004: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2955, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 576, "free": 2955}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 553, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794787328, "block_size": 4096, "block_total": 65519099, "block_available": 63914743, "block_used": 1604356, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "55", "epoch": "1726867315", "epoch_int": "1726867315", "date": "2024-09-20", "time": "17:21:55", "iso8601_micro": "2024-09-20T21:21:55.863958Z", "iso8601": "2024-09-20T21:21:55Z", "iso8601_basic": "20240920T172155863958", "iso8601_basic_short": "20240920T172155", "tz": "EDT", "tz_dst": "EDT", "tz_offset": 
"-0400"}, "ansible_lsb": {}, "ansible_interfaces": ["lsr27", "lo", "eth0", "peerlsr27"], "ansible_lsr27": {"device": "lsr27", "macaddress": "32:21:19:c5:50:d2", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3021:19ff:fec5:50d2", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "e2:02:64:1b:da:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e002:64ff:fe1b:da9a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", 
"netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a", "fe80::8ff:d5ff:fec3:77ad"], 
"ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::3021:19ff:fec5:50d2"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.380859375, "5m": 0.37744140625, "15m": 0.19921875}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
18662 1726867315.95907: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867315.95910: _low_level_execute_command(): starting 18662 1726867315.95913: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867315.1579564-19124-205240572084809/ > /dev/null 2>&1 && sleep 0' 18662 1726867315.96430: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867315.96442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867315.96464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867315.96573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867315.96579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867315.96597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867315.96610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867315.96632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867315.96704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867315.98614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867315.98667: stderr chunk (state=3): >>><<< 18662 1726867315.98684: stdout chunk (state=3): >>><<< 18662 1726867315.98705: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867315.98882: handler run complete 18662 1726867315.98885: variable 'ansible_facts' from source: unknown 18662 1726867315.99384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867315.99888: variable 'ansible_facts' from source: unknown 18662 1726867316.00039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867316.00216: attempt loop complete, returning result 18662 1726867316.00230: _execute() done 18662 1726867316.00242: dumping result to json 18662 1726867316.00292: done dumping result, returning 18662 1726867316.00304: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-efab-a8ce-000000000237] 18662 1726867316.00316: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000237 18662 1726867316.01548: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000237 18662 1726867316.01551: WORKER PROCESS EXITING ok: [managed_node2] 18662 1726867316.01890: no more pending results, returning what we have 18662 1726867316.01894: results queue empty 18662 1726867316.01894: checking for any_errors_fatal 18662 1726867316.01895: done checking for any_errors_fatal 18662 1726867316.01896: checking for max_fail_percentage 18662 1726867316.01897: done checking for max_fail_percentage 18662 1726867316.01898: checking to see if all hosts have failed and the running result is not ok 18662 1726867316.01898: done checking to see if all hosts have failed 18662 1726867316.01899: getting the remaining hosts for this loop 18662 1726867316.01900: done getting the remaining hosts for this loop 18662 1726867316.01903: getting the next task for host managed_node2 18662 1726867316.01907: done getting next task for host managed_node2 18662 1726867316.01910: ^ task is: TASK: meta (flush_handlers) 18662 1726867316.01912: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867316.01914: getting variables 18662 1726867316.01916: in VariableManager get_vars() 18662 1726867316.01937: Calling all_inventory to load vars for managed_node2 18662 1726867316.01939: Calling groups_inventory to load vars for managed_node2 18662 1726867316.01941: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867316.01949: Calling all_plugins_play to load vars for managed_node2 18662 1726867316.01951: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867316.01954: Calling groups_plugins_play to load vars for managed_node2 18662 1726867316.02146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867316.02390: done with get_vars() 18662 1726867316.02400: done getting variables 18662 1726867316.02468: in VariableManager get_vars() 18662 1726867316.02483: Calling all_inventory to load vars for managed_node2 18662 1726867316.02485: Calling groups_inventory to load vars for managed_node2 18662 1726867316.02491: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867316.02496: Calling all_plugins_play to load vars for managed_node2 18662 1726867316.02502: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867316.02505: Calling groups_plugins_play to load vars for managed_node2 18662 1726867316.02663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867316.02896: done with get_vars() 18662 1726867316.02914: done queuing things up, now waiting for results queue to drain 18662 1726867316.02916: results queue empty 18662 1726867316.02917: checking for any_errors_fatal 18662 1726867316.02919: done checking for any_errors_fatal 18662 1726867316.02924: checking for max_fail_percentage 18662 1726867316.02925: done checking for max_fail_percentage 18662 1726867316.02925: checking to see if all hosts have failed and the running result is not ok 18662 1726867316.02926: done checking to see if all hosts have failed 18662 1726867316.02927: getting the remaining hosts for this loop 18662 1726867316.02928: done getting the remaining hosts for this loop 18662 1726867316.02939: getting the next task for host managed_node2 18662 1726867316.02946: done getting next task for host managed_node2 18662 1726867316.02950: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18662 1726867316.02951: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867316.02961: getting variables 18662 1726867316.02962: in VariableManager get_vars() 18662 1726867316.02990: Calling all_inventory to load vars for managed_node2 18662 1726867316.02996: Calling groups_inventory to load vars for managed_node2 18662 1726867316.02999: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867316.03004: Calling all_plugins_play to load vars for managed_node2 18662 1726867316.03006: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867316.03009: Calling groups_plugins_play to load vars for managed_node2 18662 1726867316.03201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867316.03448: done with get_vars() 18662 1726867316.03456: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:21:56 -0400 (0:00:00.918) 0:00:10.670 ****** 18662 1726867316.03525: entering _queue_task() for managed_node2/include_tasks 18662 1726867316.03836: worker is 1 (out of 1 available) 18662 1726867316.03847: exiting _queue_task() for managed_node2/include_tasks 18662 1726867316.03860: done queuing things up, now waiting for results queue to drain 18662 1726867316.03861: waiting for pending results... 18662 1726867316.04158: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18662 1726867316.04242: in run() - task 0affcac9-a3a5-efab-a8ce-000000000019 18662 1726867316.04279: variable 'ansible_search_path' from source: unknown 18662 1726867316.04289: variable 'ansible_search_path' from source: unknown 18662 1726867316.04387: calling self._execute() 18662 1726867316.04580: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867316.04585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867316.04588: variable 'omit' from source: magic vars 18662 1726867316.05111: variable 'ansible_distribution_major_version' from source: facts 18662 1726867316.05129: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867316.05283: _execute() done 18662 1726867316.05286: dumping result to json 18662 1726867316.05289: done dumping result, returning 18662 1726867316.05292: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-efab-a8ce-000000000019] 18662 1726867316.05294: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000019 18662 1726867316.05455: no more pending results, returning what we have 18662 1726867316.05460: in VariableManager get_vars() 18662 1726867316.05504: Calling all_inventory to load vars for managed_node2 18662 1726867316.05507: Calling groups_inventory to load vars for managed_node2 18662 1726867316.05513: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867316.05783: Calling all_plugins_play to load vars for managed_node2 18662 1726867316.05787: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867316.05790: Calling groups_plugins_play to load vars for managed_node2 18662 1726867316.06207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867316.06484: done with get_vars() 18662 1726867316.06493: variable 
'ansible_search_path' from source: unknown 18662 1726867316.06498: variable 'ansible_search_path' from source: unknown 18662 1726867316.06514: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000019 18662 1726867316.06517: WORKER PROCESS EXITING 18662 1726867316.06537: we have included files to process 18662 1726867316.06538: generating all_blocks data 18662 1726867316.06540: done generating all_blocks data 18662 1726867316.06541: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18662 1726867316.06542: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18662 1726867316.06544: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18662 1726867316.07286: done processing included file 18662 1726867316.07288: iterating over new_blocks loaded from include file 18662 1726867316.07289: in VariableManager get_vars() 18662 1726867316.07310: done with get_vars() 18662 1726867316.07312: filtering new block on tags 18662 1726867316.07328: done filtering new block on tags 18662 1726867316.07331: in VariableManager get_vars() 18662 1726867316.07348: done with get_vars() 18662 1726867316.07350: filtering new block on tags 18662 1726867316.07371: done filtering new block on tags 18662 1726867316.07374: in VariableManager get_vars() 18662 1726867316.07394: done with get_vars() 18662 1726867316.07396: filtering new block on tags 18662 1726867316.07414: done filtering new block on tags 18662 1726867316.07416: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 18662 1726867316.07421: extending task lists for all hosts with included blocks 18662 1726867316.07814: done extending task lists 18662 1726867316.07815: done processing included files 18662 1726867316.07816: results queue empty 18662 1726867316.07817: checking for any_errors_fatal 18662 1726867316.07818: done checking for any_errors_fatal 18662 1726867316.07819: checking for max_fail_percentage 18662 1726867316.07820: done checking for max_fail_percentage 18662 1726867316.07820: checking to see if all hosts have failed and the running result is not ok 18662 1726867316.07821: done checking to see if all hosts have failed 18662 1726867316.07822: getting the remaining hosts for this loop 18662 1726867316.07823: done getting the remaining hosts for this loop 18662 1726867316.07825: getting the next task for host managed_node2 18662 1726867316.07829: done getting next task for host managed_node2 18662 1726867316.07831: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18662 1726867316.07834: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867316.07842: getting variables 18662 1726867316.07843: in VariableManager get_vars() 18662 1726867316.07856: Calling all_inventory to load vars for managed_node2 18662 1726867316.07858: Calling groups_inventory to load vars for managed_node2 18662 1726867316.07860: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867316.07864: Calling all_plugins_play to load vars for managed_node2 18662 1726867316.07867: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867316.07870: Calling groups_plugins_play to load vars for managed_node2 18662 1726867316.08056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867316.08335: done with get_vars() 18662 1726867316.08344: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:21:56 -0400 (0:00:00.048) 0:00:10.719 ****** 18662 1726867316.08417: entering _queue_task() for managed_node2/setup 18662 1726867316.08950: worker is 1 (out of 1 available) 18662 1726867316.08961: exiting _queue_task() for managed_node2/setup 18662 1726867316.08973: done queuing things up, now waiting for results queue to drain 18662 1726867316.08974: waiting for pending results... 18662 1726867316.09307: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18662 1726867316.09543: in run() - task 0affcac9-a3a5-efab-a8ce-000000000279 18662 1726867316.09561: variable 'ansible_search_path' from source: unknown 18662 1726867316.09567: variable 'ansible_search_path' from source: unknown 18662 1726867316.09618: calling self._execute() 18662 1726867316.09702: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867316.09723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867316.09738: variable 'omit' from source: magic vars 18662 1726867316.10094: variable 'ansible_distribution_major_version' from source: facts 18662 1726867316.10113: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867316.10312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867316.12444: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867316.12581: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867316.12585: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867316.12615: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867316.12644: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867316.12725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867316.12758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18662 1726867316.12792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867316.12839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867316.12901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867316.12923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867316.12951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867316.12981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867316.13029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867316.13046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867316.13205: variable '__network_required_facts' from source: role '' defaults 18662 1726867316.13226: variable 'ansible_facts' from source: unknown 18662 1726867316.13335: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18662 1726867316.13385: when evaluation is False, skipping this task 18662 1726867316.13388: _execute() done 18662 1726867316.13390: dumping result to json 18662 1726867316.13392: done dumping result, returning 18662 1726867316.13394: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-efab-a8ce-000000000279] 18662 1726867316.13396: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000279 18662 1726867316.13713: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000279 18662 1726867316.13717: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867316.13755: no more pending results, returning what we have 18662 1726867316.13758: results queue empty 18662 1726867316.13759: checking for any_errors_fatal 18662 1726867316.13761: done checking for any_errors_fatal 18662 1726867316.13762: checking for max_fail_percentage 18662 1726867316.13763: done checking for max_fail_percentage 18662 1726867316.13764: checking to see if all hosts have failed and the running result is not ok 18662 1726867316.13765: done checking to see if all hosts have failed 18662 1726867316.13766: getting the remaining hosts for this loop 18662 1726867316.13767: done getting the remaining hosts for 
this loop 18662 1726867316.13770: getting the next task for host managed_node2 18662 1726867316.13780: done getting next task for host managed_node2 18662 1726867316.13784: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18662 1726867316.13787: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867316.13801: getting variables 18662 1726867316.13802: in VariableManager get_vars() 18662 1726867316.13840: Calling all_inventory to load vars for managed_node2 18662 1726867316.13843: Calling groups_inventory to load vars for managed_node2 18662 1726867316.13845: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867316.13854: Calling all_plugins_play to load vars for managed_node2 18662 1726867316.13856: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867316.13859: Calling groups_plugins_play to load vars for managed_node2 18662 1726867316.14065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867316.14344: done with get_vars() 18662 1726867316.14354: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:21:56 -0400 (0:00:00.060) 0:00:10.780 ****** 18662 1726867316.14453: entering _queue_task() for managed_node2/stat 18662 1726867316.14696: worker is 1 (out of 1 available) 18662 1726867316.14711: exiting _queue_task() for managed_node2/stat 18662 1726867316.14723: done queuing things up, now waiting for results queue to drain 18662 1726867316.14724: waiting for pending results... 
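For context on the skip above: the "Ensure ansible_facts used by role are present" task (set_facts.yml:3) was dispatched through the setup action plugin, its conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated to False (every fact the role needs was already gathered), and its result was censored because no_log: true is set on the task. A minimal sketch of a task matching this trace, assuming a gather_subset of min (the log shows only the action name, not its arguments):

- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min  # assumption: the log does not show the setup arguments
  no_log: true          # implied by the "censored" result above
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0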
18662 1726867316.14988: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 18662 1726867316.15111: in run() - task 0affcac9-a3a5-efab-a8ce-00000000027b 18662 1726867316.15132: variable 'ansible_search_path' from source: unknown 18662 1726867316.15140: variable 'ansible_search_path' from source: unknown 18662 1726867316.15179: calling self._execute() 18662 1726867316.15262: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867316.15275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867316.15294: variable 'omit' from source: magic vars 18662 1726867316.15736: variable 'ansible_distribution_major_version' from source: facts 18662 1726867316.15752: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867316.15918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867316.16178: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867316.16233: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867316.16301: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867316.16315: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867316.16398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867316.16435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867316.16468: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867316.16517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867316.16591: variable '__network_is_ostree' from source: set_fact 18662 1726867316.16603: Evaluated conditional (not __network_is_ostree is defined): False 18662 1726867316.16626: when evaluation is False, skipping this task 18662 1726867316.16629: _execute() done 18662 1726867316.16631: dumping result to json 18662 1726867316.16682: done dumping result, returning 18662 1726867316.16686: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-efab-a8ce-00000000027b] 18662 1726867316.16688: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000027b skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18662 1726867316.16989: no more pending results, returning what we have 18662 1726867316.16992: results queue empty 18662 1726867316.16992: checking for any_errors_fatal 18662 1726867316.16997: done checking for any_errors_fatal 18662 1726867316.16998: checking for max_fail_percentage 18662 1726867316.17000: done checking for max_fail_percentage 18662 1726867316.17000: checking to see if all hosts have 
failed and the running result is not ok 18662 1726867316.17001: done checking to see if all hosts have failed 18662 1726867316.17001: getting the remaining hosts for this loop 18662 1726867316.17003: done getting the remaining hosts for this loop 18662 1726867316.17005: getting the next task for host managed_node2 18662 1726867316.17012: done getting next task for host managed_node2 18662 1726867316.17015: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18662 1726867316.17017: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867316.17028: getting variables 18662 1726867316.17030: in VariableManager get_vars() 18662 1726867316.17057: Calling all_inventory to load vars for managed_node2 18662 1726867316.17059: Calling groups_inventory to load vars for managed_node2 18662 1726867316.17061: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867316.17068: Calling all_plugins_play to load vars for managed_node2 18662 1726867316.17071: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867316.17073: Calling groups_plugins_play to load vars for managed_node2 18662 1726867316.17287: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000027b 18662 1726867316.17291: WORKER PROCESS EXITING 18662 1726867316.17312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867316.17541: done with get_vars() 18662 1726867316.17549: done getting variables 18662 1726867316.17597: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:21:56 -0400 (0:00:00.031) 0:00:10.811 ****** 18662 1726867316.17625: entering _queue_task() for managed_node2/set_fact 18662 1726867316.17832: worker is 1 (out of 1 available) 18662 1726867316.17843: exiting _queue_task() for managed_node2/set_fact 18662 1726867316.17854: done queuing things up, now waiting for results queue to drain 18662 1726867316.17855: waiting for pending results... 
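The two tasks at set_facts.yml:12 and set_facts.yml:17 are the role's ostree-detection pair: a stat of the ostree marker file followed by a set_fact recording whether it exists. Both carry the guard not __network_is_ostree is defined, and both are skipped in this run because __network_is_ostree already comes from an earlier set_fact (see the variable source above), so the remote stat is not repeated. A sketch of that pattern, assuming the conventional /run/ostree-booted marker path and a hypothetical register name __ostree_booted_stat (neither detail appears in this log):

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted       # assumption: conventional marker path, not shown in the log
  register: __ostree_booted_stat   # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined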
18662 1726867316.18116: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18662 1726867316.18237: in run() - task 0affcac9-a3a5-efab-a8ce-00000000027c 18662 1726867316.18257: variable 'ansible_search_path' from source: unknown 18662 1726867316.18265: variable 'ansible_search_path' from source: unknown 18662 1726867316.18312: calling self._execute() 18662 1726867316.18392: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867316.18406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867316.18426: variable 'omit' from source: magic vars 18662 1726867316.18783: variable 'ansible_distribution_major_version' from source: facts 18662 1726867316.18800: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867316.18968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867316.19238: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867316.19286: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867316.19326: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867316.19397: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867316.19482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867316.19520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867316.19550: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867316.19584: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867316.19673: variable '__network_is_ostree' from source: set_fact 18662 1726867316.19688: Evaluated conditional (not __network_is_ostree is defined): False 18662 1726867316.19695: when evaluation is False, skipping this task 18662 1726867316.19703: _execute() done 18662 1726867316.19715: dumping result to json 18662 1726867316.19728: done dumping result, returning 18662 1726867316.19740: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-efab-a8ce-00000000027c] 18662 1726867316.19750: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000027c skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18662 1726867316.19883: no more pending results, returning what we have 18662 1726867316.19887: results queue empty 18662 1726867316.19889: checking for any_errors_fatal 18662 1726867316.19895: done checking for any_errors_fatal 18662 1726867316.19896: checking for max_fail_percentage 18662 1726867316.19898: done checking for max_fail_percentage 18662 1726867316.19899: checking to see 
if all hosts have failed and the running result is not ok 18662 1726867316.19899: done checking to see if all hosts have failed 18662 1726867316.19900: getting the remaining hosts for this loop 18662 1726867316.19902: done getting the remaining hosts for this loop 18662 1726867316.19905: getting the next task for host managed_node2 18662 1726867316.19917: done getting next task for host managed_node2 18662 1726867316.19920: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18662 1726867316.19923: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867316.19935: getting variables 18662 1726867316.19937: in VariableManager get_vars() 18662 1726867316.19973: Calling all_inventory to load vars for managed_node2 18662 1726867316.19976: Calling groups_inventory to load vars for managed_node2 18662 1726867316.19980: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867316.19990: Calling all_plugins_play to load vars for managed_node2 18662 1726867316.19994: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867316.19997: Calling groups_plugins_play to load vars for managed_node2 18662 1726867316.20401: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000027c 18662 1726867316.20404: WORKER PROCESS EXITING 18662 1726867316.20428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867316.20691: done with get_vars() 18662 1726867316.20700: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:21:56 -0400 (0:00:00.031) 0:00:10.843 ****** 18662 1726867316.20790: entering _queue_task() for managed_node2/service_facts 18662 1726867316.20792: Creating lock for service_facts 18662 1726867316.21023: worker is 1 (out of 1 available) 18662 1726867316.21035: exiting _queue_task() for managed_node2/service_facts 18662 1726867316.21047: done queuing things up, now waiting for results queue to drain 18662 1726867316.21049: waiting for pending results... 
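The task at set_facts.yml:21 dispatches the service_facts module: the lines that follow show the AnsiballZ payload being built and locked, copied to a remote temporary directory over the multiplexed SSH connection, made executable, and run with /usr/bin/python3.12; the resulting ansible_facts.services dictionary is what gets printed further below. A minimal sketch of collecting that fact and reading one entry out of it (the debug task is illustrative only and is not part of this run):

- name: Check which services are running
  service_facts:

- name: Report NetworkManager state   # illustrative; NetworkManager.service appears as "running" in the output below
  debug:
    msg: "{{ ansible_facts.services['NetworkManager.service'].state }}"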
18662 1726867316.21298: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 18662 1726867316.21420: in run() - task 0affcac9-a3a5-efab-a8ce-00000000027e 18662 1726867316.21440: variable 'ansible_search_path' from source: unknown 18662 1726867316.21484: variable 'ansible_search_path' from source: unknown 18662 1726867316.21488: calling self._execute() 18662 1726867316.21572: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867316.21587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867316.21604: variable 'omit' from source: magic vars 18662 1726867316.21969: variable 'ansible_distribution_major_version' from source: facts 18662 1726867316.22047: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867316.22051: variable 'omit' from source: magic vars 18662 1726867316.22057: variable 'omit' from source: magic vars 18662 1726867316.22097: variable 'omit' from source: magic vars 18662 1726867316.22141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867316.22186: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867316.22215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867316.22238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867316.22253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867316.22293: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867316.22482: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867316.22485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867316.22488: Set connection var ansible_timeout to 10 18662 1726867316.22490: Set connection var ansible_connection to ssh 18662 1726867316.22492: Set connection var ansible_shell_executable to /bin/sh 18662 1726867316.22494: Set connection var ansible_shell_type to sh 18662 1726867316.22496: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867316.22498: Set connection var ansible_pipelining to False 18662 1726867316.22500: variable 'ansible_shell_executable' from source: unknown 18662 1726867316.22502: variable 'ansible_connection' from source: unknown 18662 1726867316.22505: variable 'ansible_module_compression' from source: unknown 18662 1726867316.22507: variable 'ansible_shell_type' from source: unknown 18662 1726867316.22512: variable 'ansible_shell_executable' from source: unknown 18662 1726867316.22514: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867316.22516: variable 'ansible_pipelining' from source: unknown 18662 1726867316.22518: variable 'ansible_timeout' from source: unknown 18662 1726867316.22520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867316.22719: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867316.22736: variable 'omit' from source: magic vars 18662 
1726867316.22752: starting attempt loop 18662 1726867316.22761: running the handler 18662 1726867316.22781: _low_level_execute_command(): starting 18662 1726867316.22795: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867316.23596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867316.23634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867316.23661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867316.23754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867316.25427: stdout chunk (state=3): >>>/root <<< 18662 1726867316.25569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867316.25588: stdout chunk (state=3): >>><<< 18662 1726867316.25600: stderr chunk (state=3): >>><<< 18662 1726867316.25701: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867316.25705: _low_level_execute_command(): starting 18662 1726867316.25707: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545 `" && echo ansible-tmp-1726867316.256213-19173-184180700604545="` echo /root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545 `" ) && sleep 
0' 18662 1726867316.26232: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867316.26245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867316.26259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867316.26298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867316.26332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18662 1726867316.26412: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867316.26424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867316.26452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867316.26521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867316.28421: stdout chunk (state=3): >>>ansible-tmp-1726867316.256213-19173-184180700604545=/root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545 <<< 18662 1726867316.28564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867316.28597: stdout chunk (state=3): >>><<< 18662 1726867316.28601: stderr chunk (state=3): >>><<< 18662 1726867316.28783: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867316.256213-19173-184180700604545=/root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867316.28786: variable 'ansible_module_compression' from source: unknown 18662 1726867316.28789: ANSIBALLZ: Using lock for service_facts 18662 
1726867316.28791: ANSIBALLZ: Acquiring lock 18662 1726867316.28794: ANSIBALLZ: Lock acquired: 140264018535792 18662 1726867316.28796: ANSIBALLZ: Creating module 18662 1726867316.46458: ANSIBALLZ: Writing module into payload 18662 1726867316.46564: ANSIBALLZ: Writing module 18662 1726867316.46604: ANSIBALLZ: Renaming module 18662 1726867316.46617: ANSIBALLZ: Done creating module 18662 1726867316.46638: variable 'ansible_facts' from source: unknown 18662 1726867316.46724: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545/AnsiballZ_service_facts.py 18662 1726867316.46909: Sending initial data 18662 1726867316.46913: Sent initial data (161 bytes) 18662 1726867316.47572: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867316.47604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867316.47620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867316.47658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867316.47759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867316.49390: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867316.49456: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867316.49510: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpumd9htum /root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545/AnsiballZ_service_facts.py <<< 18662 1726867316.49517: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545/AnsiballZ_service_facts.py" <<< 18662 1726867316.49553: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpumd9htum" to remote "/root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545/AnsiballZ_service_facts.py" <<< 18662 1726867316.50436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867316.50440: stdout chunk (state=3): >>><<< 18662 1726867316.50442: stderr chunk (state=3): >>><<< 18662 1726867316.50444: done transferring module to remote 18662 1726867316.50446: _low_level_execute_command(): starting 18662 1726867316.50448: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545/ /root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545/AnsiballZ_service_facts.py && sleep 0' 18662 1726867316.51119: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867316.51285: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867316.51301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867316.51324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867316.51466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867316.53685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867316.53691: stdout chunk (state=3): >>><<< 18662 1726867316.53693: stderr chunk (state=3): >>><<< 18662 1726867316.53696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867316.53698: _low_level_execute_command(): starting 18662 1726867316.53701: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545/AnsiballZ_service_facts.py && sleep 0' 18662 1726867316.54697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867316.55031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867316.55141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867318.14226: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 18662 1726867318.14286: stdout chunk (state=3): >>>rk.service": {"name": 
"selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", 
"status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", 
"state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": 
{"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18662 1726867318.15793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867318.15849: stderr chunk (state=3): >>><<< 18662 1726867318.15852: stdout chunk (state=3): >>><<< 18662 1726867318.16085: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": 
{"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
18662 1726867318.18368: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867318.18424: _low_level_execute_command(): starting 18662 1726867318.18436: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867316.256213-19173-184180700604545/ > /dev/null 2>&1 && sleep 0' 18662 1726867318.19094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867318.19152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867318.19171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867318.19191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867318.19267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867318.21351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867318.21361: stdout chunk (state=3): >>><<< 18662 1726867318.21372: stderr chunk (state=3): >>><<< 18662 1726867318.21393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867318.21404: handler run complete 18662 1726867318.21805: variable 'ansible_facts' from source: unknown 18662 1726867318.22132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867318.22622: variable 'ansible_facts' from source: unknown 18662 1726867318.22758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867318.22972: attempt loop complete, returning result 18662 1726867318.22975: _execute() done 18662 1726867318.22980: dumping result to json 18662 1726867318.23045: done dumping result, returning 18662 1726867318.23053: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-efab-a8ce-00000000027e] 18662 1726867318.23058: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000027e 18662 1726867318.24205: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000027e 18662 1726867318.24208: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867318.24339: no more pending results, returning what we have 18662 1726867318.24342: results queue empty 18662 1726867318.24343: checking for any_errors_fatal 18662 1726867318.24347: done checking for any_errors_fatal 18662 1726867318.24347: checking for max_fail_percentage 18662 1726867318.24349: done checking for max_fail_percentage 18662 1726867318.24350: checking to see if all hosts have failed and the running result is not ok 18662 1726867318.24351: done checking to see if all hosts have failed 18662 1726867318.24351: getting the remaining hosts for this loop 18662 1726867318.24352: done getting the remaining hosts for this loop 18662 1726867318.24355: getting the next task for host managed_node2 18662 1726867318.24361: done getting next task for host managed_node2 18662 1726867318.24364: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 18662 1726867318.24367: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867318.24379: getting variables 18662 1726867318.24380: in VariableManager get_vars() 18662 1726867318.24612: Calling all_inventory to load vars for managed_node2 18662 1726867318.24615: Calling groups_inventory to load vars for managed_node2 18662 1726867318.24618: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867318.24626: Calling all_plugins_play to load vars for managed_node2 18662 1726867318.24629: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867318.24632: Calling groups_plugins_play to load vars for managed_node2 18662 1726867318.25552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867318.26558: done with get_vars() 18662 1726867318.26571: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:21:58 -0400 (0:00:02.060) 0:00:12.903 ****** 18662 1726867318.26854: entering _queue_task() for managed_node2/package_facts 18662 1726867318.26856: Creating lock for package_facts 18662 1726867318.27170: worker is 1 (out of 1 available) 18662 1726867318.27185: exiting _queue_task() for managed_node2/package_facts 18662 1726867318.27199: done queuing things up, now waiting for results queue to drain 18662 1726867318.27200: waiting for pending results... 18662 1726867318.27452: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 18662 1726867318.27674: in run() - task 0affcac9-a3a5-efab-a8ce-00000000027f 18662 1726867318.27679: variable 'ansible_search_path' from source: unknown 18662 1726867318.27683: variable 'ansible_search_path' from source: unknown 18662 1726867318.27686: calling self._execute() 18662 1726867318.27737: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867318.27750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867318.27764: variable 'omit' from source: magic vars 18662 1726867318.28150: variable 'ansible_distribution_major_version' from source: facts 18662 1726867318.28167: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867318.28180: variable 'omit' from source: magic vars 18662 1726867318.28245: variable 'omit' from source: magic vars 18662 1726867318.28285: variable 'omit' from source: magic vars 18662 1726867318.28335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867318.28374: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867318.28401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867318.28432: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867318.28449: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867318.28544: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867318.28548: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867318.28550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867318.28616: Set 
connection var ansible_timeout to 10 18662 1726867318.28625: Set connection var ansible_connection to ssh 18662 1726867318.28637: Set connection var ansible_shell_executable to /bin/sh 18662 1726867318.28644: Set connection var ansible_shell_type to sh 18662 1726867318.28664: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867318.28760: Set connection var ansible_pipelining to False 18662 1726867318.28763: variable 'ansible_shell_executable' from source: unknown 18662 1726867318.28766: variable 'ansible_connection' from source: unknown 18662 1726867318.28768: variable 'ansible_module_compression' from source: unknown 18662 1726867318.28770: variable 'ansible_shell_type' from source: unknown 18662 1726867318.28772: variable 'ansible_shell_executable' from source: unknown 18662 1726867318.28774: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867318.28776: variable 'ansible_pipelining' from source: unknown 18662 1726867318.28780: variable 'ansible_timeout' from source: unknown 18662 1726867318.28782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867318.29045: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867318.29050: variable 'omit' from source: magic vars 18662 1726867318.29053: starting attempt loop 18662 1726867318.29055: running the handler 18662 1726867318.29057: _low_level_execute_command(): starting 18662 1726867318.29060: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867318.30231: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867318.30247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867318.30265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867318.30347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867318.30398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867318.30422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867318.30451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867318.30520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867318.32212: stdout chunk (state=3): >>>/root <<< 18662 1726867318.32345: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867318.32348: stdout chunk (state=3): >>><<< 18662 1726867318.32351: 
stderr chunk (state=3): >>><<< 18662 1726867318.32359: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867318.32370: _low_level_execute_command(): starting 18662 1726867318.32375: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748 `" && echo ansible-tmp-1726867318.3235905-19268-16096660467748="` echo /root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748 `" ) && sleep 0' 18662 1726867318.32794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867318.32801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867318.32813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 18662 1726867318.32817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867318.32819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867318.32858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867318.32863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867318.32911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867318.34852: stdout chunk (state=3): >>>ansible-tmp-1726867318.3235905-19268-16096660467748=/root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748 <<< 18662 1726867318.34963: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 18662 1726867318.34987: stderr chunk (state=3): >>><<< 18662 1726867318.34994: stdout chunk (state=3): >>><<< 18662 1726867318.35012: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867318.3235905-19268-16096660467748=/root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867318.35043: variable 'ansible_module_compression' from source: unknown 18662 1726867318.35080: ANSIBALLZ: Using lock for package_facts 18662 1726867318.35084: ANSIBALLZ: Acquiring lock 18662 1726867318.35086: ANSIBALLZ: Lock acquired: 140264018531712 18662 1726867318.35089: ANSIBALLZ: Creating module 18662 1726867318.71204: ANSIBALLZ: Writing module into payload 18662 1726867318.71334: ANSIBALLZ: Writing module 18662 1726867318.71362: ANSIBALLZ: Renaming module 18662 1726867318.71382: ANSIBALLZ: Done creating module 18662 1726867318.71484: variable 'ansible_facts' from source: unknown 18662 1726867318.71642: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748/AnsiballZ_package_facts.py 18662 1726867318.71831: Sending initial data 18662 1726867318.71835: Sent initial data (161 bytes) 18662 1726867318.72517: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867318.72591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867318.72739: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 18662 1726867318.72797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867318.74458: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867318.74492: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867318.74542: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpeazus0u7 /root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748/AnsiballZ_package_facts.py <<< 18662 1726867318.74546: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748/AnsiballZ_package_facts.py" <<< 18662 1726867318.74635: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpeazus0u7" to remote "/root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748/AnsiballZ_package_facts.py" <<< 18662 1726867318.76803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867318.76832: stderr chunk (state=3): >>><<< 18662 1726867318.76843: stdout chunk (state=3): >>><<< 18662 1726867318.76986: done transferring module to remote 18662 1726867318.76989: _low_level_execute_command(): starting 18662 1726867318.76992: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748/ /root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748/AnsiballZ_package_facts.py && sleep 0' 18662 1726867318.77495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867318.77513: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867318.77592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867318.77629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867318.77645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867318.77667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867318.77741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867318.79591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867318.79603: stdout chunk (state=3): >>><<< 18662 1726867318.79624: stderr chunk (state=3): >>><<< 18662 1726867318.79715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867318.79719: _low_level_execute_command(): starting 18662 1726867318.79721: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748/AnsiballZ_package_facts.py && sleep 0' 18662 1726867318.80242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867318.80255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867318.80268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867318.80287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867318.80301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867318.80335: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867318.80350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867318.80391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867318.80446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867318.80464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867318.80485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867318.80564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867319.25485: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 18662 1726867319.25497: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 18662 1726867319.25531: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", 
"version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", 
"version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 18662 1726867319.25590: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": 
"14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": 
"python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 18662 1726867319.25597: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 18662 1726867319.25649: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", 
"version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": 
"510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": 
"510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", 
"version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 18662 1726867319.25694: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": 
"python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18662 1726867319.27494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867319.27544: stderr chunk (state=3): >>><<< 18662 1726867319.27560: stdout chunk (state=3): >>><<< 18662 1726867319.27794: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867319.33572: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867319.33609: _low_level_execute_command(): starting 18662 1726867319.33620: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867318.3235905-19268-16096660467748/ > /dev/null 2>&1 && sleep 0' 18662 1726867319.34266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867319.34285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867319.34298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867319.34315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867319.34341: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867319.34443: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867319.34468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867319.34551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867319.36496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867319.36512: stdout chunk (state=3): >>><<< 18662 1726867319.36528: stderr chunk (state=3): >>><<< 18662 1726867319.36583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867319.36586: handler run complete 18662 1726867319.37398: variable 'ansible_facts' from source: unknown 18662 1726867319.37829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867319.39966: variable 'ansible_facts' from source: unknown 18662 1726867319.40423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867319.41179: attempt loop complete, returning result 18662 1726867319.41197: _execute() done 18662 1726867319.41224: dumping result to json 18662 1726867319.41433: done dumping result, returning 18662 1726867319.41483: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-efab-a8ce-00000000027f] 18662 1726867319.41487: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000027f 18662 1726867319.43704: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000027f 18662 1726867319.43708: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867319.43810: no more pending results, returning what we have 18662 1726867319.43813: results queue empty 18662 1726867319.43814: checking for any_errors_fatal 18662 1726867319.43819: done checking for any_errors_fatal 18662 1726867319.43820: checking for max_fail_percentage 18662 1726867319.43821: done checking for max_fail_percentage 18662 1726867319.43822: checking to see if all hosts have failed and the running result is not ok 18662 1726867319.43823: done checking to see if all hosts have failed 18662 1726867319.43823: getting the remaining hosts for this loop 18662 1726867319.43825: done getting the remaining hosts for this loop 18662 1726867319.43828: getting the next task for host managed_node2 18662 1726867319.43834: done getting next task for host managed_node2 18662 1726867319.43838: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18662 1726867319.43840: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867319.43849: getting variables 18662 1726867319.43850: in VariableManager get_vars() 18662 1726867319.43887: Calling all_inventory to load vars for managed_node2 18662 1726867319.43890: Calling groups_inventory to load vars for managed_node2 18662 1726867319.43893: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867319.43901: Calling all_plugins_play to load vars for managed_node2 18662 1726867319.43904: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867319.43907: Calling groups_plugins_play to load vars for managed_node2 18662 1726867319.45179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867319.46824: done with get_vars() 18662 1726867319.46844: done getting variables 18662 1726867319.46913: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:21:59 -0400 (0:00:01.201) 0:00:14.104 ****** 18662 1726867319.46942: entering _queue_task() for managed_node2/debug 18662 1726867319.47251: worker is 1 (out of 1 available) 18662 1726867319.47264: exiting _queue_task() for managed_node2/debug 18662 1726867319.47279: done queuing things up, now waiting for results queue to drain 18662 1726867319.47281: waiting for pending results... 18662 1726867319.47609: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 18662 1726867319.47660: in run() - task 0affcac9-a3a5-efab-a8ce-00000000001a 18662 1726867319.47683: variable 'ansible_search_path' from source: unknown 18662 1726867319.47693: variable 'ansible_search_path' from source: unknown 18662 1726867319.47742: calling self._execute() 18662 1726867319.47838: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867319.47851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867319.47866: variable 'omit' from source: magic vars 18662 1726867319.48362: variable 'ansible_distribution_major_version' from source: facts 18662 1726867319.48366: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867319.48369: variable 'omit' from source: magic vars 18662 1726867319.48372: variable 'omit' from source: magic vars 18662 1726867319.48426: variable 'network_provider' from source: set_fact 18662 1726867319.48448: variable 'omit' from source: magic vars 18662 1726867319.48501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867319.48541: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867319.48566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867319.48599: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867319.48614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 
1726867319.48650: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867319.48660: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867319.48669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867319.48776: Set connection var ansible_timeout to 10 18662 1726867319.48796: Set connection var ansible_connection to ssh 18662 1726867319.48809: Set connection var ansible_shell_executable to /bin/sh 18662 1726867319.48816: Set connection var ansible_shell_type to sh 18662 1726867319.48831: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867319.48882: Set connection var ansible_pipelining to False 18662 1726867319.48886: variable 'ansible_shell_executable' from source: unknown 18662 1726867319.48888: variable 'ansible_connection' from source: unknown 18662 1726867319.48891: variable 'ansible_module_compression' from source: unknown 18662 1726867319.48893: variable 'ansible_shell_type' from source: unknown 18662 1726867319.48903: variable 'ansible_shell_executable' from source: unknown 18662 1726867319.48908: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867319.48914: variable 'ansible_pipelining' from source: unknown 18662 1726867319.48922: variable 'ansible_timeout' from source: unknown 18662 1726867319.48932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867319.49076: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867319.49125: variable 'omit' from source: magic vars 18662 1726867319.49128: starting attempt loop 18662 1726867319.49131: running the handler 18662 1726867319.49162: handler run complete 18662 1726867319.49233: attempt loop complete, returning result 18662 1726867319.49237: _execute() done 18662 1726867319.49240: dumping result to json 18662 1726867319.49242: done dumping result, returning 18662 1726867319.49244: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-efab-a8ce-00000000001a] 18662 1726867319.49247: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000001a ok: [managed_node2] => {} MSG: Using network provider: nm 18662 1726867319.49388: no more pending results, returning what we have 18662 1726867319.49392: results queue empty 18662 1726867319.49393: checking for any_errors_fatal 18662 1726867319.49404: done checking for any_errors_fatal 18662 1726867319.49405: checking for max_fail_percentage 18662 1726867319.49407: done checking for max_fail_percentage 18662 1726867319.49408: checking to see if all hosts have failed and the running result is not ok 18662 1726867319.49408: done checking to see if all hosts have failed 18662 1726867319.49409: getting the remaining hosts for this loop 18662 1726867319.49410: done getting the remaining hosts for this loop 18662 1726867319.49414: getting the next task for host managed_node2 18662 1726867319.49421: done getting next task for host managed_node2 18662 1726867319.49425: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18662 1726867319.49427: ^ state is: HOST STATE: block=2, 
task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867319.49437: getting variables 18662 1726867319.49439: in VariableManager get_vars() 18662 1726867319.49473: Calling all_inventory to load vars for managed_node2 18662 1726867319.49476: Calling groups_inventory to load vars for managed_node2 18662 1726867319.49480: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867319.49491: Calling all_plugins_play to load vars for managed_node2 18662 1726867319.49494: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867319.49497: Calling groups_plugins_play to load vars for managed_node2 18662 1726867319.50193: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000001a 18662 1726867319.50197: WORKER PROCESS EXITING 18662 1726867319.51174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867319.52734: done with get_vars() 18662 1726867319.52755: done getting variables 18662 1726867319.52816: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:21:59 -0400 (0:00:00.059) 0:00:14.163 ****** 18662 1726867319.52845: entering _queue_task() for managed_node2/fail 18662 1726867319.53306: worker is 1 (out of 1 available) 18662 1726867319.53320: exiting _queue_task() for managed_node2/fail 18662 1726867319.53331: done queuing things up, now waiting for results queue to drain 18662 1726867319.53333: waiting for pending results... 
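The entries above record the "Print network provider" task completing and emitting "Using network provider: nm" before the next role task is queued. For reference, a minimal sketch of a debug task that would produce that kind of message follows; the variable name network_provider is taken from the log, while the task body itself is an assumption and not the role's actual source.

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"  # network_provider was set by an earlier set_fact, per the log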
18662 1726867319.53433: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18662 1726867319.53556: in run() - task 0affcac9-a3a5-efab-a8ce-00000000001b 18662 1726867319.53580: variable 'ansible_search_path' from source: unknown 18662 1726867319.53590: variable 'ansible_search_path' from source: unknown 18662 1726867319.53634: calling self._execute() 18662 1726867319.53727: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867319.53783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867319.53787: variable 'omit' from source: magic vars 18662 1726867319.54156: variable 'ansible_distribution_major_version' from source: facts 18662 1726867319.54174: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867319.54313: variable 'network_state' from source: role '' defaults 18662 1726867319.54335: Evaluated conditional (network_state != {}): False 18662 1726867319.54437: when evaluation is False, skipping this task 18662 1726867319.54440: _execute() done 18662 1726867319.54443: dumping result to json 18662 1726867319.54445: done dumping result, returning 18662 1726867319.54448: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-efab-a8ce-00000000001b] 18662 1726867319.54451: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000001b 18662 1726867319.54527: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000001b 18662 1726867319.54531: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867319.54591: no more pending results, returning what we have 18662 1726867319.54596: results queue empty 18662 1726867319.54598: checking for any_errors_fatal 18662 1726867319.54604: done checking for any_errors_fatal 18662 1726867319.54605: checking for max_fail_percentage 18662 1726867319.54607: done checking for max_fail_percentage 18662 1726867319.54611: checking to see if all hosts have failed and the running result is not ok 18662 1726867319.54612: done checking to see if all hosts have failed 18662 1726867319.54613: getting the remaining hosts for this loop 18662 1726867319.54614: done getting the remaining hosts for this loop 18662 1726867319.54618: getting the next task for host managed_node2 18662 1726867319.54626: done getting next task for host managed_node2 18662 1726867319.54629: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18662 1726867319.54632: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867319.54647: getting variables 18662 1726867319.54649: in VariableManager get_vars() 18662 1726867319.54691: Calling all_inventory to load vars for managed_node2 18662 1726867319.54694: Calling groups_inventory to load vars for managed_node2 18662 1726867319.54697: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867319.54714: Calling all_plugins_play to load vars for managed_node2 18662 1726867319.54717: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867319.54720: Calling groups_plugins_play to load vars for managed_node2 18662 1726867319.56187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867319.57881: done with get_vars() 18662 1726867319.57903: done getting variables 18662 1726867319.57964: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:21:59 -0400 (0:00:00.051) 0:00:14.215 ****** 18662 1726867319.57996: entering _queue_task() for managed_node2/fail 18662 1726867319.58279: worker is 1 (out of 1 available) 18662 1726867319.58291: exiting _queue_task() for managed_node2/fail 18662 1726867319.58302: done queuing things up, now waiting for results queue to drain 18662 1726867319.58303: waiting for pending results... 
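The skip result above shows the initscripts-provider abort being bypassed because the conditional network_state != {} evaluated to False (network_state falls back to the role's empty default in this run). A hedged sketch of a fail task gated this way follows; the when condition on network_state is taken from the log, while the message and the provider check are assumptions.

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying a network state configuration is not supported with the initscripts provider
  when:
    - network_state != {}               # condition recorded in the skip result above
    - network_provider == "initscripts" # assumed additional guard, not shown in the log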
18662 1726867319.58696: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18662 1726867319.58701: in run() - task 0affcac9-a3a5-efab-a8ce-00000000001c 18662 1726867319.58704: variable 'ansible_search_path' from source: unknown 18662 1726867319.58706: variable 'ansible_search_path' from source: unknown 18662 1726867319.58734: calling self._execute() 18662 1726867319.58823: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867319.58836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867319.58851: variable 'omit' from source: magic vars 18662 1726867319.59259: variable 'ansible_distribution_major_version' from source: facts 18662 1726867319.59280: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867319.59452: variable 'network_state' from source: role '' defaults 18662 1726867319.59472: Evaluated conditional (network_state != {}): False 18662 1726867319.59485: when evaluation is False, skipping this task 18662 1726867319.59493: _execute() done 18662 1726867319.59500: dumping result to json 18662 1726867319.59511: done dumping result, returning 18662 1726867319.59525: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-efab-a8ce-00000000001c] 18662 1726867319.59535: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000001c skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867319.59731: no more pending results, returning what we have 18662 1726867319.59736: results queue empty 18662 1726867319.59737: checking for any_errors_fatal 18662 1726867319.59746: done checking for any_errors_fatal 18662 1726867319.59747: checking for max_fail_percentage 18662 1726867319.59750: done checking for max_fail_percentage 18662 1726867319.59751: checking to see if all hosts have failed and the running result is not ok 18662 1726867319.59752: done checking to see if all hosts have failed 18662 1726867319.59752: getting the remaining hosts for this loop 18662 1726867319.59754: done getting the remaining hosts for this loop 18662 1726867319.59758: getting the next task for host managed_node2 18662 1726867319.59766: done getting next task for host managed_node2 18662 1726867319.59770: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18662 1726867319.59772: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867319.59790: getting variables 18662 1726867319.59792: in VariableManager get_vars() 18662 1726867319.59836: Calling all_inventory to load vars for managed_node2 18662 1726867319.59839: Calling groups_inventory to load vars for managed_node2 18662 1726867319.59842: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867319.59853: Calling all_plugins_play to load vars for managed_node2 18662 1726867319.59856: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867319.59858: Calling groups_plugins_play to load vars for managed_node2 18662 1726867319.60495: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000001c 18662 1726867319.60499: WORKER PROCESS EXITING 18662 1726867319.62063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867319.66045: done with get_vars() 18662 1726867319.66076: done getting variables 18662 1726867319.66475: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:21:59 -0400 (0:00:00.085) 0:00:14.300 ****** 18662 1726867319.66513: entering _queue_task() for managed_node2/fail 18662 1726867319.67313: worker is 1 (out of 1 available) 18662 1726867319.67326: exiting _queue_task() for managed_node2/fail 18662 1726867319.67562: done queuing things up, now waiting for results queue to drain 18662 1726867319.67564: waiting for pending results... 
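Likewise, the "system version of the managed host is below 8" abort above is skipped on the same network_state != {} condition. The version bound implied by the task name would normally be a second condition; the sketch below assumes that check, and both the comparison against 8 and the message are illustrative only.

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying network_state requires a managed host running EL 8 or later
  when:
    - network_state != {}                           # condition recorded in the skip result above
    - ansible_distribution_major_version | int < 8  # assumed version check implied by the task name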
18662 1726867319.67982: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18662 1726867319.68098: in run() - task 0affcac9-a3a5-efab-a8ce-00000000001d 18662 1726867319.68226: variable 'ansible_search_path' from source: unknown 18662 1726867319.68258: variable 'ansible_search_path' from source: unknown 18662 1726867319.68301: calling self._execute() 18662 1726867319.68503: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867319.68557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867319.68591: variable 'omit' from source: magic vars 18662 1726867319.69685: variable 'ansible_distribution_major_version' from source: facts 18662 1726867319.69688: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867319.70140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867319.73424: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867319.73502: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867319.73550: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867319.73590: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867319.73622: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867319.73705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867319.73747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867319.73776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867319.73825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867319.73847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867319.73947: variable 'ansible_distribution_major_version' from source: facts 18662 1726867319.73971: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18662 1726867319.74089: variable 'ansible_distribution' from source: facts 18662 1726867319.74098: variable '__network_rh_distros' from source: role '' defaults 18662 1726867319.74114: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18662 1726867319.74372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867319.74407: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867319.74440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867319.74484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867319.74507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867319.74559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867319.74590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867319.74725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867319.74728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867319.74730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867319.74732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867319.74750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867319.74772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867319.74818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867319.74840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867319.75159: variable 'network_connections' from source: play vars 18662 1726867319.75173: variable 'interface' from source: set_fact 18662 1726867319.75245: variable 'interface' from source: set_fact 18662 1726867319.75263: variable 'interface' from source: set_fact 18662 1726867319.75330: variable 'interface' from source: set_fact 18662 1726867319.75345: variable 'network_state' from source: role '' defaults 18662 
1726867319.75413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867319.75576: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867319.75635: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867319.75670: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867319.75711: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867319.75757: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867319.75810: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867319.75829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867319.75917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867319.75921: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 18662 1726867319.75923: when evaluation is False, skipping this task 18662 1726867319.75925: _execute() done 18662 1726867319.75927: dumping result to json 18662 1726867319.75930: done dumping result, returning 18662 1726867319.75934: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-efab-a8ce-00000000001d] 18662 1726867319.75943: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000001d skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 18662 1726867319.76150: no more pending results, returning what we have 18662 1726867319.76155: results queue empty 18662 1726867319.76156: checking for any_errors_fatal 18662 1726867319.76165: done checking for any_errors_fatal 18662 1726867319.76166: checking for max_fail_percentage 18662 1726867319.76169: done checking for max_fail_percentage 18662 1726867319.76170: checking to see if all hosts have failed and the running result is not ok 18662 1726867319.76170: done checking to see if all hosts have failed 18662 1726867319.76171: getting the remaining hosts for this loop 18662 1726867319.76173: done getting the remaining hosts for this loop 18662 1726867319.76179: getting the next task for host managed_node2 18662 1726867319.76186: done getting next task for host managed_node2 18662 1726867319.76191: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18662 1726867319.76193: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867319.76206: getting variables 18662 1726867319.76211: in VariableManager get_vars() 18662 1726867319.76248: Calling all_inventory to load vars for managed_node2 18662 1726867319.76251: Calling groups_inventory to load vars for managed_node2 18662 1726867319.76254: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867319.76265: Calling all_plugins_play to load vars for managed_node2 18662 1726867319.76268: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867319.76271: Calling groups_plugins_play to load vars for managed_node2 18662 1726867319.77194: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000001d 18662 1726867319.77198: WORKER PROCESS EXITING 18662 1726867319.77974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867319.80456: done with get_vars() 18662 1726867319.80479: done getting variables 18662 1726867319.80572: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:21:59 -0400 (0:00:00.140) 0:00:14.441 ****** 18662 1726867319.80602: entering _queue_task() for managed_node2/dnf 18662 1726867319.80858: worker is 1 (out of 1 available) 18662 1726867319.80870: exiting _queue_task() for managed_node2/dnf 18662 1726867319.80984: done queuing things up, now waiting for results queue to drain 18662 1726867319.80986: waiting for pending results... 
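The teaming guard skipped just above works by filtering network_connections (and the interfaces list inside network_state) for entries whose type matches ^team$. The playbook below is a minimal stand-alone sketch for evaluating that same expression against sample data, not the role's actual task file: the connection list, the empty network_state, and the failure message are illustrative, and only the when expression is copied verbatim from the trace.

- name: Reproduce the teaming guard from the trace (sketch)
  hosts: localhost
  gather_facts: false
  vars:
    # Sample data only: one ethernet connection and an empty desired state,
    # so neither branch of the conditional finds a "team" entry.
    network_connections:
      - name: eth0
        type: ethernet
    network_state: {}
  tasks:
    - name: Abort when any team interface is requested
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later.
      when: >-
        network_connections | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
        or network_state.get("interfaces", []) | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0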
18662 1726867319.81133: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18662 1726867319.81239: in run() - task 0affcac9-a3a5-efab-a8ce-00000000001e 18662 1726867319.81256: variable 'ansible_search_path' from source: unknown 18662 1726867319.81263: variable 'ansible_search_path' from source: unknown 18662 1726867319.81303: calling self._execute() 18662 1726867319.81386: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867319.81398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867319.81414: variable 'omit' from source: magic vars 18662 1726867319.81771: variable 'ansible_distribution_major_version' from source: facts 18662 1726867319.81790: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867319.81991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867319.84217: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867319.84291: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867319.84338: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867319.84375: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867319.84413: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867319.84493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867319.84532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867319.84561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867319.84606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867319.84633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867319.84753: variable 'ansible_distribution' from source: facts 18662 1726867319.84764: variable 'ansible_distribution_major_version' from source: facts 18662 1726867319.84785: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 18662 1726867319.84901: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867319.85039: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867319.85072: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867319.85103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867319.85149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867319.85174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867319.85217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867319.85243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867319.85273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867319.85321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867319.85338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867319.85383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867319.85416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867319.85444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867319.85483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867319.85582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867319.85846: variable 'network_connections' from source: play vars 18662 1726867319.85954: variable 'interface' from source: set_fact 18662 1726867319.85957: variable 'interface' from source: set_fact 18662 1726867319.85960: variable 'interface' from source: set_fact 18662 1726867319.86185: variable 'interface' from source: set_fact 18662 1726867319.86258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 18662 1726867319.86663: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867319.86710: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867319.86760: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867319.87043: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867319.87046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867319.87049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867319.87082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867319.87181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867319.87240: variable '__network_team_connections_defined' from source: role '' defaults 18662 1726867319.87821: variable 'network_connections' from source: play vars 18662 1726867319.87832: variable 'interface' from source: set_fact 18662 1726867319.87926: variable 'interface' from source: set_fact 18662 1726867319.88127: variable 'interface' from source: set_fact 18662 1726867319.88130: variable 'interface' from source: set_fact 18662 1726867319.88132: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18662 1726867319.88134: when evaluation is False, skipping this task 18662 1726867319.88136: _execute() done 18662 1726867319.88281: dumping result to json 18662 1726867319.88285: done dumping result, returning 18662 1726867319.88287: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-efab-a8ce-00000000001e] 18662 1726867319.88289: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000001e skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18662 1726867319.88414: no more pending results, returning what we have 18662 1726867319.88418: results queue empty 18662 1726867319.88419: checking for any_errors_fatal 18662 1726867319.88428: done checking for any_errors_fatal 18662 1726867319.88429: checking for max_fail_percentage 18662 1726867319.88432: done checking for max_fail_percentage 18662 1726867319.88433: checking to see if all hosts have failed and the running result is not ok 18662 1726867319.88434: done checking to see if all hosts have failed 18662 1726867319.88434: getting the remaining hosts for this loop 18662 1726867319.88436: done getting the remaining hosts for this loop 18662 1726867319.88439: getting the next task for host managed_node2 18662 1726867319.88447: done getting next task for host managed_node2 18662 
1726867319.88451: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18662 1726867319.88454: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867319.88467: getting variables 18662 1726867319.88469: in VariableManager get_vars() 18662 1726867319.88507: Calling all_inventory to load vars for managed_node2 18662 1726867319.88512: Calling groups_inventory to load vars for managed_node2 18662 1726867319.88515: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867319.88525: Calling all_plugins_play to load vars for managed_node2 18662 1726867319.88527: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867319.88530: Calling groups_plugins_play to load vars for managed_node2 18662 1726867319.89385: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000001e 18662 1726867319.89388: WORKER PROCESS EXITING 18662 1726867319.91567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867319.94859: done with get_vars() 18662 1726867319.94889: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18662 1726867319.94966: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:21:59 -0400 (0:00:00.146) 0:00:14.587 ****** 18662 1726867319.95245: entering _queue_task() for managed_node2/yum 18662 1726867319.95248: Creating lock for yum 18662 1726867319.95893: worker is 1 (out of 1 available) 18662 1726867319.95906: exiting _queue_task() for managed_node2/yum 18662 1726867319.95923: done queuing things up, now waiting for results queue to drain 18662 1726867319.95924: waiting for pending results... 
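The DNF probe skipped just above follows a common "check only" pattern: run the package module in check mode so it reports whether updates exist without installing anything, and gate it on the wireless/team flags. A minimal fragment under those assumptions; the package name is a placeholder, the default(false) filters are added so the fragment evaluates on its own, and the role's real task may differ.

- name: Check if updates for network packages are available (sketch)
  ansible.builtin.dnf:
    name: NetworkManager            # placeholder package list
    state: latest
  check_mode: true                  # report what would change, install nothing
  register: __network_updates_probe
  when: >-
    __network_wireless_connections_defined | default(false)
    or __network_team_connections_defined | default(false)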
18662 1726867319.96536: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18662 1726867319.96769: in run() - task 0affcac9-a3a5-efab-a8ce-00000000001f 18662 1726867319.96784: variable 'ansible_search_path' from source: unknown 18662 1726867319.96788: variable 'ansible_search_path' from source: unknown 18662 1726867319.96876: calling self._execute() 18662 1726867319.96924: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867319.96937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867319.96955: variable 'omit' from source: magic vars 18662 1726867319.97323: variable 'ansible_distribution_major_version' from source: facts 18662 1726867319.97339: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867319.97518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867320.00815: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867320.01248: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867320.01347: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867320.01351: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867320.01374: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867320.01462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.01498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.01527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.01570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.01591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.01691: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.01783: Evaluated conditional (ansible_distribution_major_version | int < 8): False 18662 1726867320.01786: when evaluation is False, skipping this task 18662 1726867320.01789: _execute() done 18662 1726867320.01791: dumping result to json 18662 1726867320.01795: done dumping result, returning 18662 1726867320.01797: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-efab-a8ce-00000000001f] 18662 
1726867320.01800: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000001f 18662 1726867320.01872: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000001f 18662 1726867320.01875: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 18662 1726867320.01931: no more pending results, returning what we have 18662 1726867320.01936: results queue empty 18662 1726867320.01937: checking for any_errors_fatal 18662 1726867320.01943: done checking for any_errors_fatal 18662 1726867320.01944: checking for max_fail_percentage 18662 1726867320.01946: done checking for max_fail_percentage 18662 1726867320.01946: checking to see if all hosts have failed and the running result is not ok 18662 1726867320.01947: done checking to see if all hosts have failed 18662 1726867320.01948: getting the remaining hosts for this loop 18662 1726867320.01949: done getting the remaining hosts for this loop 18662 1726867320.01953: getting the next task for host managed_node2 18662 1726867320.01960: done getting next task for host managed_node2 18662 1726867320.01964: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18662 1726867320.01966: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867320.01982: getting variables 18662 1726867320.01984: in VariableManager get_vars() 18662 1726867320.02024: Calling all_inventory to load vars for managed_node2 18662 1726867320.02027: Calling groups_inventory to load vars for managed_node2 18662 1726867320.02030: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867320.02041: Calling all_plugins_play to load vars for managed_node2 18662 1726867320.02043: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867320.02046: Calling groups_plugins_play to load vars for managed_node2 18662 1726867320.10945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867320.12876: done with get_vars() 18662 1726867320.12900: done getting variables 18662 1726867320.12949: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:22:00 -0400 (0:00:00.177) 0:00:14.765 ****** 18662 1726867320.12980: entering _queue_task() for managed_node2/fail 18662 1726867320.13384: worker is 1 (out of 1 available) 18662 1726867320.13396: exiting _queue_task() for managed_node2/fail 18662 1726867320.13648: done queuing things up, now waiting for results queue to drain 18662 1726867320.13650: waiting for pending results... 
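The YUM variant of the same probe is gated on the distribution major version, which is why it was skipped on this host, and the trace also shows ansible-core 2.17 redirecting ansible.builtin.yum to ansible.builtin.dnf. A sketch of pairing the two tasks on complementary version tests; both when expressions appear earlier in the trace, while the package name and the check_mode usage are illustrative.

- name: Check for updates through YUM on EL 7 and older (sketch)
  ansible.builtin.yum:
    name: NetworkManager
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8

- name: Check for updates through DNF on Fedora or EL 8 and newer (sketch)
  ansible.builtin.dnf:
    name: NetworkManager
    state: latest
  check_mode: true
  when: ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7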
18662 1726867320.14103: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18662 1726867320.14111: in run() - task 0affcac9-a3a5-efab-a8ce-000000000020 18662 1726867320.14114: variable 'ansible_search_path' from source: unknown 18662 1726867320.14385: variable 'ansible_search_path' from source: unknown 18662 1726867320.14389: calling self._execute() 18662 1726867320.14585: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867320.14589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867320.14592: variable 'omit' from source: magic vars 18662 1726867320.15249: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.15266: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867320.15431: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867320.15641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867320.20509: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867320.20636: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867320.20686: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867320.20734: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867320.20793: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867320.20925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.20962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.21010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.21139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.21210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.21284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.21340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.21371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.21475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.21634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.21638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.21641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.21727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.21836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.21865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.22058: variable 'network_connections' from source: play vars 18662 1726867320.22089: variable 'interface' from source: set_fact 18662 1726867320.22191: variable 'interface' from source: set_fact 18662 1726867320.22207: variable 'interface' from source: set_fact 18662 1726867320.22388: variable 'interface' from source: set_fact 18662 1726867320.22393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867320.22612: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867320.22730: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867320.22780: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867320.22856: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867320.22895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867320.22922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867320.22960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.22996: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867320.23188: 
variable '__network_team_connections_defined' from source: role '' defaults 18662 1726867320.23438: variable 'network_connections' from source: play vars 18662 1726867320.23448: variable 'interface' from source: set_fact 18662 1726867320.23582: variable 'interface' from source: set_fact 18662 1726867320.23585: variable 'interface' from source: set_fact 18662 1726867320.23587: variable 'interface' from source: set_fact 18662 1726867320.23619: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18662 1726867320.23627: when evaluation is False, skipping this task 18662 1726867320.23633: _execute() done 18662 1726867320.23639: dumping result to json 18662 1726867320.23645: done dumping result, returning 18662 1726867320.23657: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-efab-a8ce-000000000020] 18662 1726867320.23674: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000020 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18662 1726867320.23823: no more pending results, returning what we have 18662 1726867320.23827: results queue empty 18662 1726867320.23828: checking for any_errors_fatal 18662 1726867320.23835: done checking for any_errors_fatal 18662 1726867320.23836: checking for max_fail_percentage 18662 1726867320.23838: done checking for max_fail_percentage 18662 1726867320.23838: checking to see if all hosts have failed and the running result is not ok 18662 1726867320.23839: done checking to see if all hosts have failed 18662 1726867320.23840: getting the remaining hosts for this loop 18662 1726867320.23841: done getting the remaining hosts for this loop 18662 1726867320.23845: getting the next task for host managed_node2 18662 1726867320.23852: done getting next task for host managed_node2 18662 1726867320.23855: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18662 1726867320.23857: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867320.23870: getting variables 18662 1726867320.23871: in VariableManager get_vars() 18662 1726867320.23913: Calling all_inventory to load vars for managed_node2 18662 1726867320.23915: Calling groups_inventory to load vars for managed_node2 18662 1726867320.23918: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867320.23928: Calling all_plugins_play to load vars for managed_node2 18662 1726867320.23933: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867320.23936: Calling groups_plugins_play to load vars for managed_node2 18662 1726867320.24702: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000020 18662 1726867320.24706: WORKER PROCESS EXITING 18662 1726867320.25596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867320.28929: done with get_vars() 18662 1726867320.28963: done getting variables 18662 1726867320.29130: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:22:00 -0400 (0:00:00.162) 0:00:14.927 ****** 18662 1726867320.29210: entering _queue_task() for managed_node2/package 18662 1726867320.29994: worker is 1 (out of 1 available) 18662 1726867320.30009: exiting _queue_task() for managed_node2/package 18662 1726867320.30021: done queuing things up, now waiting for results queue to drain 18662 1726867320.30023: waiting for pending results... 
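The consent task skipped just above is a fail action guarded by the same two flags, so it only stops the play when wireless or team connections would force a NetworkManager restart. A sketch of that guard, assuming a hypothetical opt-in variable named network_allow_restart (not taken from the trace); the two flag names are the ones printed above, with default(false) added so the fragment stands alone.

- name: Ask user's consent to restart NetworkManager (sketch)
  ansible.builtin.fail:
    msg: >-
      Wireless or team connections require restarting NetworkManager.
      Set network_allow_restart=true to confirm the restart.
  when:
    - >-
      __network_wireless_connections_defined | default(false)
      or __network_team_connections_defined | default(false)
    - not (network_allow_restart | default(false) | bool)   # hypothetical opt-in flag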
18662 1726867320.30476: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 18662 1726867320.30761: in run() - task 0affcac9-a3a5-efab-a8ce-000000000021 18662 1726867320.30773: variable 'ansible_search_path' from source: unknown 18662 1726867320.30778: variable 'ansible_search_path' from source: unknown 18662 1726867320.30822: calling self._execute() 18662 1726867320.31041: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867320.31047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867320.31056: variable 'omit' from source: magic vars 18662 1726867320.32428: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.32486: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867320.32772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867320.32989: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867320.33039: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867320.33082: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867320.33219: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867320.33438: variable 'network_packages' from source: role '' defaults 18662 1726867320.33598: variable '__network_provider_setup' from source: role '' defaults 18662 1726867320.33637: variable '__network_service_name_default_nm' from source: role '' defaults 18662 1726867320.33748: variable '__network_service_name_default_nm' from source: role '' defaults 18662 1726867320.33863: variable '__network_packages_default_nm' from source: role '' defaults 18662 1726867320.33866: variable '__network_packages_default_nm' from source: role '' defaults 18662 1726867320.34128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867320.37713: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867320.37771: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867320.37808: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867320.37843: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867320.37897: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867320.37973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.38009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.38036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.38076: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.38097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.38150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.38173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.38210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.38254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.38264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.38758: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18662 1726867320.39123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.39145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.39165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.39224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.39235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.39423: variable 'ansible_python' from source: facts 18662 1726867320.39427: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18662 1726867320.39481: variable '__network_wpa_supplicant_required' from source: role '' defaults 18662 1726867320.39615: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18662 1726867320.39811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.39837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 18662 1726867320.39859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.39898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.39940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.39988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.40006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.40035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.40184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.40188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.40330: variable 'network_connections' from source: play vars 18662 1726867320.40336: variable 'interface' from source: set_fact 18662 1726867320.40511: variable 'interface' from source: set_fact 18662 1726867320.40524: variable 'interface' from source: set_fact 18662 1726867320.40708: variable 'interface' from source: set_fact 18662 1726867320.40835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867320.40838: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867320.40841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.40866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867320.40921: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867320.41585: variable 'network_connections' from source: play vars 18662 1726867320.41589: variable 'interface' from source: set_fact 18662 1726867320.41593: variable 'interface' from source: set_fact 18662 1726867320.41596: variable 'interface' from source: set_fact 18662 1726867320.41687: variable 'interface' from source: set_fact 18662 1726867320.41871: variable 
'__network_packages_default_wireless' from source: role '' defaults 18662 1726867320.41967: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867320.42426: variable 'network_connections' from source: play vars 18662 1726867320.42432: variable 'interface' from source: set_fact 18662 1726867320.42494: variable 'interface' from source: set_fact 18662 1726867320.42529: variable 'interface' from source: set_fact 18662 1726867320.42781: variable 'interface' from source: set_fact 18662 1726867320.42784: variable '__network_packages_default_team' from source: role '' defaults 18662 1726867320.42786: variable '__network_team_connections_defined' from source: role '' defaults 18662 1726867320.43299: variable 'network_connections' from source: play vars 18662 1726867320.43303: variable 'interface' from source: set_fact 18662 1726867320.43367: variable 'interface' from source: set_fact 18662 1726867320.43374: variable 'interface' from source: set_fact 18662 1726867320.43445: variable 'interface' from source: set_fact 18662 1726867320.43582: variable '__network_service_name_default_initscripts' from source: role '' defaults 18662 1726867320.43585: variable '__network_service_name_default_initscripts' from source: role '' defaults 18662 1726867320.43588: variable '__network_packages_default_initscripts' from source: role '' defaults 18662 1726867320.43634: variable '__network_packages_default_initscripts' from source: role '' defaults 18662 1726867320.43867: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18662 1726867320.44393: variable 'network_connections' from source: play vars 18662 1726867320.44399: variable 'interface' from source: set_fact 18662 1726867320.44489: variable 'interface' from source: set_fact 18662 1726867320.44497: variable 'interface' from source: set_fact 18662 1726867320.44612: variable 'interface' from source: set_fact 18662 1726867320.44616: variable 'ansible_distribution' from source: facts 18662 1726867320.44620: variable '__network_rh_distros' from source: role '' defaults 18662 1726867320.44626: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.44721: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18662 1726867320.44874: variable 'ansible_distribution' from source: facts 18662 1726867320.44878: variable '__network_rh_distros' from source: role '' defaults 18662 1726867320.44964: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.44967: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18662 1726867320.45244: variable 'ansible_distribution' from source: facts 18662 1726867320.45247: variable '__network_rh_distros' from source: role '' defaults 18662 1726867320.45255: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.45295: variable 'network_provider' from source: set_fact 18662 1726867320.45482: variable 'ansible_facts' from source: unknown 18662 1726867320.46464: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 18662 1726867320.46467: when evaluation is False, skipping this task 18662 1726867320.46470: _execute() done 18662 1726867320.46472: dumping result to json 18662 1726867320.46474: done dumping result, returning 18662 1726867320.46503: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 
[0affcac9-a3a5-efab-a8ce-000000000021] 18662 1726867320.46506: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000021 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18662 1726867320.46813: no more pending results, returning what we have 18662 1726867320.46816: results queue empty 18662 1726867320.46817: checking for any_errors_fatal 18662 1726867320.46822: done checking for any_errors_fatal 18662 1726867320.46823: checking for max_fail_percentage 18662 1726867320.46824: done checking for max_fail_percentage 18662 1726867320.46825: checking to see if all hosts have failed and the running result is not ok 18662 1726867320.46826: done checking to see if all hosts have failed 18662 1726867320.46826: getting the remaining hosts for this loop 18662 1726867320.46827: done getting the remaining hosts for this loop 18662 1726867320.46830: getting the next task for host managed_node2 18662 1726867320.46835: done getting next task for host managed_node2 18662 1726867320.46839: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18662 1726867320.46841: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867320.46853: getting variables 18662 1726867320.46855: in VariableManager get_vars() 18662 1726867320.46890: Calling all_inventory to load vars for managed_node2 18662 1726867320.46892: Calling groups_inventory to load vars for managed_node2 18662 1726867320.46894: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867320.46903: Calling all_plugins_play to load vars for managed_node2 18662 1726867320.46916: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867320.46921: Calling groups_plugins_play to load vars for managed_node2 18662 1726867320.47476: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000021 18662 1726867320.47481: WORKER PROCESS EXITING 18662 1726867320.50750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867320.54443: done with get_vars() 18662 1726867320.54479: done getting variables 18662 1726867320.54772: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:22:00 -0400 (0:00:00.257) 0:00:15.185 ****** 18662 1726867320.54990: entering _queue_task() for managed_node2/package 18662 1726867320.55838: worker is 1 (out of 1 available) 18662 1726867320.55848: exiting _queue_task() for managed_node2/package 18662 1726867320.55859: done queuing things up, now waiting for results queue to drain 18662 1726867320.55860: waiting for pending results... 
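The package install was skipped because every name in network_packages already appears in the gathered package facts, so the subset test made the when clause False and the package manager was never invoked. A sketch of that idle-skip pattern with a placeholder package list; the subset expression is the one from the trace, and ansible_facts.packages is populated by the package_facts module.

- name: Gather the installed-package inventory
  ansible.builtin.package_facts:
    manager: auto

- name: Install network packages only when something is missing (sketch)
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  vars:
    network_packages:               # placeholder list for the sketch
      - NetworkManager
  when: not network_packages is subset(ansible_facts.packages.keys())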
18662 1726867320.56252: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18662 1726867320.56258: in run() - task 0affcac9-a3a5-efab-a8ce-000000000022 18662 1726867320.56261: variable 'ansible_search_path' from source: unknown 18662 1726867320.56264: variable 'ansible_search_path' from source: unknown 18662 1726867320.56371: calling self._execute() 18662 1726867320.56469: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867320.56487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867320.56498: variable 'omit' from source: magic vars 18662 1726867320.56914: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.56959: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867320.57194: variable 'network_state' from source: role '' defaults 18662 1726867320.57196: Evaluated conditional (network_state != {}): False 18662 1726867320.57198: when evaluation is False, skipping this task 18662 1726867320.57200: _execute() done 18662 1726867320.57202: dumping result to json 18662 1726867320.57203: done dumping result, returning 18662 1726867320.57206: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-efab-a8ce-000000000022] 18662 1726867320.57208: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000022 18662 1726867320.57497: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000022 18662 1726867320.57501: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867320.57548: no more pending results, returning what we have 18662 1726867320.57551: results queue empty 18662 1726867320.57552: checking for any_errors_fatal 18662 1726867320.57557: done checking for any_errors_fatal 18662 1726867320.57558: checking for max_fail_percentage 18662 1726867320.57559: done checking for max_fail_percentage 18662 1726867320.57560: checking to see if all hosts have failed and the running result is not ok 18662 1726867320.57561: done checking to see if all hosts have failed 18662 1726867320.57561: getting the remaining hosts for this loop 18662 1726867320.57563: done getting the remaining hosts for this loop 18662 1726867320.57566: getting the next task for host managed_node2 18662 1726867320.57571: done getting next task for host managed_node2 18662 1726867320.57575: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18662 1726867320.57579: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867320.57592: getting variables 18662 1726867320.57594: in VariableManager get_vars() 18662 1726867320.57635: Calling all_inventory to load vars for managed_node2 18662 1726867320.57638: Calling groups_inventory to load vars for managed_node2 18662 1726867320.57640: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867320.57649: Calling all_plugins_play to load vars for managed_node2 18662 1726867320.57766: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867320.57775: Calling groups_plugins_play to load vars for managed_node2 18662 1726867320.60650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867320.64096: done with get_vars() 18662 1726867320.64158: done getting variables 18662 1726867320.64218: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:22:00 -0400 (0:00:00.093) 0:00:15.278 ****** 18662 1726867320.64336: entering _queue_task() for managed_node2/package 18662 1726867320.64783: worker is 1 (out of 1 available) 18662 1726867320.64795: exiting _queue_task() for managed_node2/package 18662 1726867320.64810: done queuing things up, now waiting for results queue to drain 18662 1726867320.64812: waiting for pending results... 
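The task just skipped above ("Install NetworkManager and nmstate when using network_state variable", tasks/main.yml:85) and the one being queued next (tasks/main.yml:96) are gated on the same condition, network_state != {}. A hedged sketch of the pattern, with the package list assumed from the task name rather than copied from the role source:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}

Because network_state resolves from the role defaults in this run (source: role '' defaults), the conditional evaluates to False and both network_state-gated install tasks are skipped.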
18662 1726867320.65049: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18662 1726867320.65316: in run() - task 0affcac9-a3a5-efab-a8ce-000000000023 18662 1726867320.65320: variable 'ansible_search_path' from source: unknown 18662 1726867320.65322: variable 'ansible_search_path' from source: unknown 18662 1726867320.65325: calling self._execute() 18662 1726867320.65327: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867320.65330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867320.65333: variable 'omit' from source: magic vars 18662 1726867320.65701: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.65717: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867320.65837: variable 'network_state' from source: role '' defaults 18662 1726867320.65873: Evaluated conditional (network_state != {}): False 18662 1726867320.65879: when evaluation is False, skipping this task 18662 1726867320.65882: _execute() done 18662 1726867320.65884: dumping result to json 18662 1726867320.65886: done dumping result, returning 18662 1726867320.65889: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-efab-a8ce-000000000023] 18662 1726867320.65891: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000023 18662 1726867320.66017: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000023 18662 1726867320.66021: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867320.66076: no more pending results, returning what we have 18662 1726867320.66082: results queue empty 18662 1726867320.66083: checking for any_errors_fatal 18662 1726867320.66089: done checking for any_errors_fatal 18662 1726867320.66090: checking for max_fail_percentage 18662 1726867320.66092: done checking for max_fail_percentage 18662 1726867320.66093: checking to see if all hosts have failed and the running result is not ok 18662 1726867320.66094: done checking to see if all hosts have failed 18662 1726867320.66095: getting the remaining hosts for this loop 18662 1726867320.66096: done getting the remaining hosts for this loop 18662 1726867320.66099: getting the next task for host managed_node2 18662 1726867320.66109: done getting next task for host managed_node2 18662 1726867320.66113: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18662 1726867320.66116: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867320.66131: getting variables 18662 1726867320.66133: in VariableManager get_vars() 18662 1726867320.66172: Calling all_inventory to load vars for managed_node2 18662 1726867320.66174: Calling groups_inventory to load vars for managed_node2 18662 1726867320.66434: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867320.66446: Calling all_plugins_play to load vars for managed_node2 18662 1726867320.66449: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867320.66452: Calling groups_plugins_play to load vars for managed_node2 18662 1726867320.68813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867320.70039: done with get_vars() 18662 1726867320.70054: done getting variables 18662 1726867320.70130: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:22:00 -0400 (0:00:00.058) 0:00:15.337 ****** 18662 1726867320.70150: entering _queue_task() for managed_node2/service 18662 1726867320.70151: Creating lock for service 18662 1726867320.70374: worker is 1 (out of 1 available) 18662 1726867320.70388: exiting _queue_task() for managed_node2/service 18662 1726867320.70400: done queuing things up, now waiting for results queue to drain 18662 1726867320.70401: waiting for pending results... 
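The service task queued here (tasks/main.yml:109) only fires when the play defines wireless or team connections, presumably so a freshly configured NetworkManager picks up support for those profile types. A minimal sketch of what such a task could look like; the module arguments are assumptions, and only the task name, path, and when expression come from this log:

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined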
18662 1726867320.70563: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18662 1726867320.70636: in run() - task 0affcac9-a3a5-efab-a8ce-000000000024 18662 1726867320.70647: variable 'ansible_search_path' from source: unknown 18662 1726867320.70651: variable 'ansible_search_path' from source: unknown 18662 1726867320.70679: calling self._execute() 18662 1726867320.70749: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867320.70754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867320.70763: variable 'omit' from source: magic vars 18662 1726867320.71222: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.71226: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867320.71228: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867320.71388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867320.74161: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867320.74295: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867320.74403: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867320.74440: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867320.74472: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867320.74548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.74585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.74650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.74681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.74697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.74754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.74773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.74976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18662 1726867320.74982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.74985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.75020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.75085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.75089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.75336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.75352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.75718: variable 'network_connections' from source: play vars 18662 1726867320.75741: variable 'interface' from source: set_fact 18662 1726867320.75848: variable 'interface' from source: set_fact 18662 1726867320.75853: variable 'interface' from source: set_fact 18662 1726867320.75867: variable 'interface' from source: set_fact 18662 1726867320.76066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867320.76903: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867320.76911: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867320.76914: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867320.76982: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867320.76995: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867320.77041: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867320.77071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.77112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867320.77191: variable '__network_team_connections_defined' from source: role '' defaults 
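Both flags checked here are resolved from role defaults and then re-evaluated against network_connections and the interface set_fact values, which suggests they are Jinja expressions over the connection list. One plausible way such a flag could be derived in the role defaults — an illustration only, not the role's actual definition:

    __network_wireless_connections_defined: "{{ network_connections | selectattr('type', 'defined') | selectattr('type', 'equalto', 'wireless') | list | length > 0 }}"

In this run both flags evaluate to false, so the restart task is skipped, as shown just below.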
18662 1726867320.77556: variable 'network_connections' from source: play vars 18662 1726867320.77559: variable 'interface' from source: set_fact 18662 1726867320.77581: variable 'interface' from source: set_fact 18662 1726867320.77592: variable 'interface' from source: set_fact 18662 1726867320.77663: variable 'interface' from source: set_fact 18662 1726867320.77707: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18662 1726867320.77721: when evaluation is False, skipping this task 18662 1726867320.77729: _execute() done 18662 1726867320.77737: dumping result to json 18662 1726867320.77775: done dumping result, returning 18662 1726867320.77785: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-efab-a8ce-000000000024] 18662 1726867320.77795: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000024 18662 1726867320.78086: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000024 18662 1726867320.78089: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18662 1726867320.78143: no more pending results, returning what we have 18662 1726867320.78147: results queue empty 18662 1726867320.78148: checking for any_errors_fatal 18662 1726867320.78155: done checking for any_errors_fatal 18662 1726867320.78156: checking for max_fail_percentage 18662 1726867320.78158: done checking for max_fail_percentage 18662 1726867320.78159: checking to see if all hosts have failed and the running result is not ok 18662 1726867320.78160: done checking to see if all hosts have failed 18662 1726867320.78160: getting the remaining hosts for this loop 18662 1726867320.78162: done getting the remaining hosts for this loop 18662 1726867320.78165: getting the next task for host managed_node2 18662 1726867320.78173: done getting next task for host managed_node2 18662 1726867320.78189: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18662 1726867320.78191: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867320.78206: getting variables 18662 1726867320.78210: in VariableManager get_vars() 18662 1726867320.78251: Calling all_inventory to load vars for managed_node2 18662 1726867320.78254: Calling groups_inventory to load vars for managed_node2 18662 1726867320.78256: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867320.78268: Calling all_plugins_play to load vars for managed_node2 18662 1726867320.78271: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867320.78274: Calling groups_plugins_play to load vars for managed_node2 18662 1726867320.81028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867320.82866: done with get_vars() 18662 1726867320.82893: done getting variables 18662 1726867320.83164: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:22:00 -0400 (0:00:00.132) 0:00:15.469 ****** 18662 1726867320.83398: entering _queue_task() for managed_node2/service 18662 1726867320.83928: worker is 1 (out of 1 available) 18662 1726867320.83941: exiting _queue_task() for managed_node2/service 18662 1726867320.83954: done queuing things up, now waiting for results queue to drain 18662 1726867320.83955: waiting for pending results... 18662 1726867320.84316: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18662 1726867320.84470: in run() - task 0affcac9-a3a5-efab-a8ce-000000000025 18662 1726867320.84474: variable 'ansible_search_path' from source: unknown 18662 1726867320.84481: variable 'ansible_search_path' from source: unknown 18662 1726867320.84580: calling self._execute() 18662 1726867320.84788: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867320.84795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867320.84804: variable 'omit' from source: magic vars 18662 1726867320.85319: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.85323: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867320.85484: variable 'network_provider' from source: set_fact 18662 1726867320.85487: variable 'network_state' from source: role '' defaults 18662 1726867320.85555: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18662 1726867320.85559: variable 'omit' from source: magic vars 18662 1726867320.85562: variable 'omit' from source: magic vars 18662 1726867320.85583: variable 'network_service_name' from source: role '' defaults 18662 1726867320.85661: variable 'network_service_name' from source: role '' defaults 18662 1726867320.85775: variable '__network_provider_setup' from source: role '' defaults 18662 1726867320.85794: variable '__network_service_name_default_nm' from source: role '' defaults 18662 1726867320.85848: variable '__network_service_name_default_nm' from source: role '' defaults 18662 1726867320.85857: variable '__network_packages_default_nm' from source: role '' defaults 
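This is the first task in the trace whose conditional comes back True (network_provider == "nm" or network_state != {}), so the role actually drives the service module. Based on the variables resolved here and the module invocation reported at the end of this run (name=NetworkManager, state=started, enabled=true), the task plausibly looks like the following sketch; the exact argument spelling in the role source is an assumption:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}

The remainder of the trace below is the generic execution machinery for that single task: the ssh connection plugin creates a remote temp directory, transfers AnsiballZ_systemd.py, runs it with /usr/bin/python3.12, and parses the returned JSON, which shows NetworkManager already enabled and active, so the task reports changed: false.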
18662 1726867320.85928: variable '__network_packages_default_nm' from source: role '' defaults 18662 1726867320.86242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867320.90598: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867320.90721: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867320.90795: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867320.90833: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867320.90972: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867320.91052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.91194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.91224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.91382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.91386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.91443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.91465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.91594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.91638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.91651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.92170: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18662 1726867320.92415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.92438: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.92462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.92604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.92633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.92819: variable 'ansible_python' from source: facts 18662 1726867320.92847: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18662 1726867320.93093: variable '__network_wpa_supplicant_required' from source: role '' defaults 18662 1726867320.93171: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18662 1726867320.93444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.93467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.93595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.93640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.93662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.93806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867320.93834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867320.93869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.93908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867320.93925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867320.94067: variable 'network_connections' from 
source: play vars 18662 1726867320.94074: variable 'interface' from source: set_fact 18662 1726867320.94150: variable 'interface' from source: set_fact 18662 1726867320.94160: variable 'interface' from source: set_fact 18662 1726867320.94297: variable 'interface' from source: set_fact 18662 1726867320.94352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867320.94556: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867320.94608: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867320.94652: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867320.94692: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867320.94770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867320.94786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867320.94876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867320.94881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867320.94902: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867320.95190: variable 'network_connections' from source: play vars 18662 1726867320.95216: variable 'interface' from source: set_fact 18662 1726867320.95272: variable 'interface' from source: set_fact 18662 1726867320.95284: variable 'interface' from source: set_fact 18662 1726867320.95482: variable 'interface' from source: set_fact 18662 1726867320.95485: variable '__network_packages_default_wireless' from source: role '' defaults 18662 1726867320.95488: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867320.95837: variable 'network_connections' from source: play vars 18662 1726867320.95934: variable 'interface' from source: set_fact 18662 1726867320.95937: variable 'interface' from source: set_fact 18662 1726867320.95940: variable 'interface' from source: set_fact 18662 1726867320.96009: variable 'interface' from source: set_fact 18662 1726867320.96088: variable '__network_packages_default_team' from source: role '' defaults 18662 1726867320.96325: variable '__network_team_connections_defined' from source: role '' defaults 18662 1726867320.97138: variable 'network_connections' from source: play vars 18662 1726867320.97200: variable 'interface' from source: set_fact 18662 1726867320.97322: variable 'interface' from source: set_fact 18662 1726867320.97326: variable 'interface' from source: set_fact 18662 1726867320.97458: variable 'interface' from source: set_fact 18662 1726867320.97571: variable '__network_service_name_default_initscripts' from source: role '' defaults 18662 1726867320.97636: variable '__network_service_name_default_initscripts' from source: role '' defaults 18662 
1726867320.97644: variable '__network_packages_default_initscripts' from source: role '' defaults 18662 1726867320.97822: variable '__network_packages_default_initscripts' from source: role '' defaults 18662 1726867320.98147: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18662 1726867320.98782: variable 'network_connections' from source: play vars 18662 1726867320.98785: variable 'interface' from source: set_fact 18662 1726867320.98788: variable 'interface' from source: set_fact 18662 1726867320.98790: variable 'interface' from source: set_fact 18662 1726867320.98818: variable 'interface' from source: set_fact 18662 1726867320.98825: variable 'ansible_distribution' from source: facts 18662 1726867320.98828: variable '__network_rh_distros' from source: role '' defaults 18662 1726867320.98834: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.98862: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18662 1726867320.99083: variable 'ansible_distribution' from source: facts 18662 1726867320.99087: variable '__network_rh_distros' from source: role '' defaults 18662 1726867320.99089: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.99092: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18662 1726867320.99287: variable 'ansible_distribution' from source: facts 18662 1726867320.99290: variable '__network_rh_distros' from source: role '' defaults 18662 1726867320.99295: variable 'ansible_distribution_major_version' from source: facts 18662 1726867320.99333: variable 'network_provider' from source: set_fact 18662 1726867320.99357: variable 'omit' from source: magic vars 18662 1726867320.99396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867320.99437: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867320.99458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867320.99486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867320.99506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867320.99570: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867320.99582: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867320.99588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867320.99651: Set connection var ansible_timeout to 10 18662 1726867320.99655: Set connection var ansible_connection to ssh 18662 1726867320.99661: Set connection var ansible_shell_executable to /bin/sh 18662 1726867320.99663: Set connection var ansible_shell_type to sh 18662 1726867320.99673: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867320.99679: Set connection var ansible_pipelining to False 18662 1726867320.99705: variable 'ansible_shell_executable' from source: unknown 18662 1726867320.99715: variable 'ansible_connection' from source: unknown 18662 1726867320.99719: variable 'ansible_module_compression' from source: unknown 18662 1726867320.99721: variable 'ansible_shell_type' from source: unknown 18662 1726867320.99724: variable 
'ansible_shell_executable' from source: unknown 18662 1726867320.99726: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867320.99732: variable 'ansible_pipelining' from source: unknown 18662 1726867320.99734: variable 'ansible_timeout' from source: unknown 18662 1726867320.99736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867320.99845: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867320.99853: variable 'omit' from source: magic vars 18662 1726867320.99859: starting attempt loop 18662 1726867320.99862: running the handler 18662 1726867320.99945: variable 'ansible_facts' from source: unknown 18662 1726867321.00719: _low_level_execute_command(): starting 18662 1726867321.00726: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867321.01383: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867321.01410: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867321.01452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867321.01503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867321.01518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867321.01535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867321.01611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867321.03300: stdout chunk (state=3): >>>/root <<< 18662 1726867321.03396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867321.03502: stderr chunk (state=3): >>><<< 18662 1726867321.03505: stdout chunk (state=3): >>><<< 18662 1726867321.03508: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867321.03511: _low_level_execute_command(): starting 18662 1726867321.03514: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039 `" && echo ansible-tmp-1726867321.0344539-19368-278019993177039="` echo /root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039 `" ) && sleep 0' 18662 1726867321.04063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867321.04073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867321.04135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867321.04143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867321.04196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867321.04291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867321.04300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867321.04333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867321.06286: stdout chunk (state=3): >>>ansible-tmp-1726867321.0344539-19368-278019993177039=/root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039 <<< 18662 1726867321.06403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867321.06439: stderr chunk (state=3): >>><<< 18662 1726867321.06445: stdout chunk (state=3): >>><<< 18662 1726867321.06554: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867321.0344539-19368-278019993177039=/root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867321.06557: variable 'ansible_module_compression' from source: unknown 18662 1726867321.06589: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 18662 1726867321.06593: ANSIBALLZ: Acquiring lock 18662 1726867321.06600: ANSIBALLZ: Lock acquired: 140264020905808 18662 1726867321.06603: ANSIBALLZ: Creating module 18662 1726867321.34085: ANSIBALLZ: Writing module into payload 18662 1726867321.34204: ANSIBALLZ: Writing module 18662 1726867321.34482: ANSIBALLZ: Renaming module 18662 1726867321.34561: ANSIBALLZ: Done creating module 18662 1726867321.34603: variable 'ansible_facts' from source: unknown 18662 1726867321.34983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039/AnsiballZ_systemd.py 18662 1726867321.35198: Sending initial data 18662 1726867321.35207: Sent initial data (156 bytes) 18662 1726867321.36547: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867321.36795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867321.36974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867321.36979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867321.37092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867321.37158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867321.38841: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 
debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867321.38871: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867321.38938: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpz_x2rlnv /root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039/AnsiballZ_systemd.py <<< 18662 1726867321.38941: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039/AnsiballZ_systemd.py" <<< 18662 1726867321.39017: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpz_x2rlnv" to remote "/root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039/AnsiballZ_systemd.py" <<< 18662 1726867321.42110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867321.42155: stderr chunk (state=3): >>><<< 18662 1726867321.42164: stdout chunk (state=3): >>><<< 18662 1726867321.42349: done transferring module to remote 18662 1726867321.42352: _low_level_execute_command(): starting 18662 1726867321.42355: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039/ /root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039/AnsiballZ_systemd.py && sleep 0' 18662 1726867321.43493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867321.43601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867321.43776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867321.43798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867321.43875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867321.45782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867321.45793: stdout chunk (state=3): >>><<< 18662 1726867321.45803: stderr chunk (state=3): >>><<< 18662 1726867321.45829: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867321.45838: _low_level_execute_command(): starting 18662 1726867321.45847: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039/AnsiballZ_systemd.py && sleep 0' 18662 1726867321.47494: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867321.47564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867321.47583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867321.48017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867321.48065: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867321.77964: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", 
"RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4513792", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3326504960", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "843888000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 18662 1726867321.78000: stdout chunk (state=3): 
>>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", 
"RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18662 1726867321.79930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867321.79934: stdout chunk (state=3): >>><<< 18662 1726867321.79952: stderr chunk (state=3): >>><<< 18662 1726867321.79969: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4513792", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3326504960", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "843888000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867321.80375: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867321.80582: _low_level_execute_command(): starting 18662 1726867321.80586: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867321.0344539-19368-278019993177039/ > /dev/null 2>&1 && sleep 0' 18662 1726867321.81655: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867321.81730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867321.81745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867321.81768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867321.81797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867321.81970: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867321.82116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867321.82150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867321.84050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867321.84060: stdout chunk (state=3): >>><<< 18662 1726867321.84384: stderr chunk (state=3): >>><<< 18662 1726867321.84387: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867321.84390: handler run complete 18662 1726867321.84392: attempt loop complete, returning result 18662 1726867321.84395: _execute() done 18662 1726867321.84397: dumping result to json 18662 1726867321.84421: done dumping result, returning 18662 1726867321.84498: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-efab-a8ce-000000000025] 18662 1726867321.84501: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000025 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867321.87683: no more pending results, returning what we have 18662 1726867321.87687: results queue empty 18662 1726867321.87688: checking for any_errors_fatal 18662 1726867321.87694: done checking for any_errors_fatal 18662 1726867321.87695: checking for max_fail_percentage 18662 1726867321.87697: done checking for max_fail_percentage 18662 1726867321.87698: checking to see if all hosts have failed and the running result is not ok 18662 1726867321.87699: done checking to see if all hosts have failed 18662 1726867321.87699: getting the remaining hosts for this loop 18662 1726867321.87701: done getting the remaining hosts for this loop 18662 1726867321.87704: getting the next task for host managed_node2 18662 1726867321.87711: done getting next task for host managed_node2 18662 1726867321.87715: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18662 1726867321.87717: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867321.87727: getting variables 18662 1726867321.87729: in VariableManager get_vars() 18662 1726867321.87765: Calling all_inventory to load vars for managed_node2 18662 1726867321.87767: Calling groups_inventory to load vars for managed_node2 18662 1726867321.87770: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867321.87785: Calling all_plugins_play to load vars for managed_node2 18662 1726867321.87789: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867321.87792: Calling groups_plugins_play to load vars for managed_node2 18662 1726867321.89510: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000025 18662 1726867321.89516: WORKER PROCESS EXITING 18662 1726867321.92132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867321.95544: done with get_vars() 18662 1726867321.95571: done getting variables 18662 1726867321.95644: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:22:01 -0400 (0:00:01.122) 0:00:16.592 ****** 18662 1726867321.95676: entering _queue_task() for managed_node2/service 18662 1726867321.96004: worker is 1 (out of 1 available) 18662 1726867321.96018: exiting _queue_task() for managed_node2/service 18662 1726867321.96141: done queuing things up, now waiting for results queue to drain 18662 1726867321.96143: waiting for pending results... 
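The censored "Enable and start NetworkManager" result above comes from an ansible.legacy.systemd invocation whose module_args are visible in the raw stdout (name=NetworkManager, state=started, enabled=true, scope=system), with the result hidden because no_log was set. A minimal sketch of an equivalent task, reconstructed from those logged arguments rather than taken from the role's actual source, might look like this:

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager   # unit reported as enabled/active in the status dump above
        state: started
        enabled: true
        scope: system
      no_log: true             # matches the "output has been hidden" message in the result

The returned "changed": false is consistent with the status dump: the unit was already enabled and running, so the module only reported state.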
18662 1726867321.96324: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18662 1726867321.96439: in run() - task 0affcac9-a3a5-efab-a8ce-000000000026 18662 1726867321.96464: variable 'ansible_search_path' from source: unknown 18662 1726867321.96478: variable 'ansible_search_path' from source: unknown 18662 1726867321.96520: calling self._execute() 18662 1726867321.96625: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867321.96636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867321.96651: variable 'omit' from source: magic vars 18662 1726867321.97263: variable 'ansible_distribution_major_version' from source: facts 18662 1726867321.97281: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867321.97695: variable 'network_provider' from source: set_fact 18662 1726867321.97699: Evaluated conditional (network_provider == "nm"): True 18662 1726867321.97819: variable '__network_wpa_supplicant_required' from source: role '' defaults 18662 1726867321.97947: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18662 1726867321.98348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867322.01651: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867322.01793: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867322.02058: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867322.02062: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867322.02065: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867322.02207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867322.02244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867322.02302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867322.02428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867322.02448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867322.02682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867322.02686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 18662 1726867322.02688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867322.02730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867322.02799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867322.02856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867322.02954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867322.02987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867322.03473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867322.03479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867322.03641: variable 'network_connections' from source: play vars 18662 1726867322.03907: variable 'interface' from source: set_fact 18662 1726867322.03913: variable 'interface' from source: set_fact 18662 1726867322.04125: variable 'interface' from source: set_fact 18662 1726867322.04161: variable 'interface' from source: set_fact 18662 1726867322.04427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867322.05059: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867322.05223: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867322.05258: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867322.05344: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867322.05746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867322.05749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867322.05752: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867322.05754: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867322.05990: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867322.06486: variable 'network_connections' from source: play vars 18662 1726867322.06617: variable 'interface' from source: set_fact 18662 1726867322.06667: variable 'interface' from source: set_fact 18662 1726867322.06732: variable 'interface' from source: set_fact 18662 1726867322.06796: variable 'interface' from source: set_fact 18662 1726867322.07050: Evaluated conditional (__network_wpa_supplicant_required): False 18662 1726867322.07053: when evaluation is False, skipping this task 18662 1726867322.07056: _execute() done 18662 1726867322.07068: dumping result to json 18662 1726867322.07071: done dumping result, returning 18662 1726867322.07073: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-efab-a8ce-000000000026] 18662 1726867322.07075: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000026 18662 1726867322.07155: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000026 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18662 1726867322.07315: no more pending results, returning what we have 18662 1726867322.07319: results queue empty 18662 1726867322.07320: checking for any_errors_fatal 18662 1726867322.07340: done checking for any_errors_fatal 18662 1726867322.07341: checking for max_fail_percentage 18662 1726867322.07343: done checking for max_fail_percentage 18662 1726867322.07344: checking to see if all hosts have failed and the running result is not ok 18662 1726867322.07344: done checking to see if all hosts have failed 18662 1726867322.07345: getting the remaining hosts for this loop 18662 1726867322.07347: done getting the remaining hosts for this loop 18662 1726867322.07350: getting the next task for host managed_node2 18662 1726867322.07358: done getting next task for host managed_node2 18662 1726867322.07363: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18662 1726867322.07365: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867322.07384: getting variables 18662 1726867322.07387: in VariableManager get_vars() 18662 1726867322.07433: Calling all_inventory to load vars for managed_node2 18662 1726867322.07436: Calling groups_inventory to load vars for managed_node2 18662 1726867322.07439: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867322.07451: Calling all_plugins_play to load vars for managed_node2 18662 1726867322.07454: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867322.07457: Calling groups_plugins_play to load vars for managed_node2 18662 1726867322.08294: WORKER PROCESS EXITING 18662 1726867322.11123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867322.14792: done with get_vars() 18662 1726867322.14820: done getting variables 18662 1726867322.14879: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:22:02 -0400 (0:00:00.192) 0:00:16.784 ****** 18662 1726867322.14905: entering _queue_task() for managed_node2/service 18662 1726867322.15523: worker is 1 (out of 1 available) 18662 1726867322.15534: exiting _queue_task() for managed_node2/service 18662 1726867322.15544: done queuing things up, now waiting for results queue to drain 18662 1726867322.15545: waiting for pending results... 
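The wpa_supplicant task queued above ends up skipped because __network_wpa_supplicant_required evaluates to False (no IEEE 802.1X or wireless connections are defined in network_connections). A hedged sketch of what such a gated service task could look like follows; the service name is an assumption, and only the task name and the false_condition string are taken from the log:

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant                         # assumed unit name; not shown because the task was skipped
        state: started
        enabled: true
      when:
        - network_provider == "nm"                   # evaluated True earlier in this run
        - __network_wpa_supplicant_required | bool   # evaluated False, so the task is skipped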
18662 1726867322.16295: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 18662 1726867322.16300: in run() - task 0affcac9-a3a5-efab-a8ce-000000000027 18662 1726867322.16304: variable 'ansible_search_path' from source: unknown 18662 1726867322.16307: variable 'ansible_search_path' from source: unknown 18662 1726867322.16314: calling self._execute() 18662 1726867322.16587: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867322.16601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867322.16618: variable 'omit' from source: magic vars 18662 1726867322.17346: variable 'ansible_distribution_major_version' from source: facts 18662 1726867322.17426: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867322.17657: variable 'network_provider' from source: set_fact 18662 1726867322.17669: Evaluated conditional (network_provider == "initscripts"): False 18662 1726867322.17676: when evaluation is False, skipping this task 18662 1726867322.17741: _execute() done 18662 1726867322.17750: dumping result to json 18662 1726867322.17757: done dumping result, returning 18662 1726867322.17768: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-efab-a8ce-000000000027] 18662 1726867322.17780: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000027 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867322.18029: no more pending results, returning what we have 18662 1726867322.18034: results queue empty 18662 1726867322.18035: checking for any_errors_fatal 18662 1726867322.18045: done checking for any_errors_fatal 18662 1726867322.18045: checking for max_fail_percentage 18662 1726867322.18048: done checking for max_fail_percentage 18662 1726867322.18049: checking to see if all hosts have failed and the running result is not ok 18662 1726867322.18049: done checking to see if all hosts have failed 18662 1726867322.18050: getting the remaining hosts for this loop 18662 1726867322.18051: done getting the remaining hosts for this loop 18662 1726867322.18056: getting the next task for host managed_node2 18662 1726867322.18063: done getting next task for host managed_node2 18662 1726867322.18067: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18662 1726867322.18071: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867322.18087: getting variables 18662 1726867322.18089: in VariableManager get_vars() 18662 1726867322.18129: Calling all_inventory to load vars for managed_node2 18662 1726867322.18132: Calling groups_inventory to load vars for managed_node2 18662 1726867322.18135: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867322.18148: Calling all_plugins_play to load vars for managed_node2 18662 1726867322.18151: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867322.18155: Calling groups_plugins_play to load vars for managed_node2 18662 1726867322.18994: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000027 18662 1726867322.18997: WORKER PROCESS EXITING 18662 1726867322.21116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867322.24652: done with get_vars() 18662 1726867322.24673: done getting variables 18662 1726867322.24735: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:22:02 -0400 (0:00:00.098) 0:00:16.883 ****** 18662 1726867322.24765: entering _queue_task() for managed_node2/copy 18662 1726867322.25499: worker is 1 (out of 1 available) 18662 1726867322.25513: exiting _queue_task() for managed_node2/copy 18662 1726867322.25524: done queuing things up, now waiting for results queue to drain 18662 1726867322.25525: waiting for pending results... 
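The "Enable network service" skip above, and the "Ensure initscripts network file dependency is present" skip that follows, both hinge on network_provider, which this run resolves via set_fact to "nm", so every initscripts-only task reports false_condition: network_provider == "initscripts". A minimal sketch of pinning the provider explicitly from the play, assuming the role accepts network_provider as a play-level variable (the variable name comes from the log; the play layout is illustrative):

    - hosts: managed_node2
      vars:
        network_provider: nm       # forces the NetworkManager code path; initscripts-only tasks are skipped
      roles:
        - fedora.linux_system_roles.network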
18662 1726867322.26200: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18662 1726867322.26542: in run() - task 0affcac9-a3a5-efab-a8ce-000000000028 18662 1726867322.26557: variable 'ansible_search_path' from source: unknown 18662 1726867322.26561: variable 'ansible_search_path' from source: unknown 18662 1726867322.26759: calling self._execute() 18662 1726867322.26985: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867322.26990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867322.26993: variable 'omit' from source: magic vars 18662 1726867322.28037: variable 'ansible_distribution_major_version' from source: facts 18662 1726867322.28041: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867322.28226: variable 'network_provider' from source: set_fact 18662 1726867322.28232: Evaluated conditional (network_provider == "initscripts"): False 18662 1726867322.28235: when evaluation is False, skipping this task 18662 1726867322.28238: _execute() done 18662 1726867322.28240: dumping result to json 18662 1726867322.28243: done dumping result, returning 18662 1726867322.28254: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-efab-a8ce-000000000028] 18662 1726867322.28257: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000028 18662 1726867322.28401: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000028 18662 1726867322.28404: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18662 1726867322.28460: no more pending results, returning what we have 18662 1726867322.28465: results queue empty 18662 1726867322.28466: checking for any_errors_fatal 18662 1726867322.28473: done checking for any_errors_fatal 18662 1726867322.28474: checking for max_fail_percentage 18662 1726867322.28476: done checking for max_fail_percentage 18662 1726867322.28479: checking to see if all hosts have failed and the running result is not ok 18662 1726867322.28480: done checking to see if all hosts have failed 18662 1726867322.28480: getting the remaining hosts for this loop 18662 1726867322.28482: done getting the remaining hosts for this loop 18662 1726867322.28486: getting the next task for host managed_node2 18662 1726867322.28492: done getting next task for host managed_node2 18662 1726867322.28496: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18662 1726867322.28499: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867322.28516: getting variables 18662 1726867322.28518: in VariableManager get_vars() 18662 1726867322.28555: Calling all_inventory to load vars for managed_node2 18662 1726867322.28558: Calling groups_inventory to load vars for managed_node2 18662 1726867322.28560: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867322.28571: Calling all_plugins_play to load vars for managed_node2 18662 1726867322.28573: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867322.28576: Calling groups_plugins_play to load vars for managed_node2 18662 1726867322.31827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867322.35030: done with get_vars() 18662 1726867322.35060: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:22:02 -0400 (0:00:00.105) 0:00:16.988 ****** 18662 1726867322.35343: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 18662 1726867322.35345: Creating lock for fedora.linux_system_roles.network_connections 18662 1726867322.36315: worker is 1 (out of 1 available) 18662 1726867322.36326: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 18662 1726867322.36336: done queuing things up, now waiting for results queue to drain 18662 1726867322.36337: waiting for pending results... 18662 1726867322.36685: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18662 1726867322.36691: in run() - task 0affcac9-a3a5-efab-a8ce-000000000029 18662 1726867322.36694: variable 'ansible_search_path' from source: unknown 18662 1726867322.36696: variable 'ansible_search_path' from source: unknown 18662 1726867322.36814: calling self._execute() 18662 1726867322.37021: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867322.37033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867322.37048: variable 'omit' from source: magic vars 18662 1726867322.37960: variable 'ansible_distribution_major_version' from source: facts 18662 1726867322.38069: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867322.38073: variable 'omit' from source: magic vars 18662 1726867322.38075: variable 'omit' from source: magic vars 18662 1726867322.38365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867322.41427: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867322.41496: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867322.41543: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867322.41582: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867322.41612: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867322.41694: variable 'network_provider' from source: set_fact 18662 1726867322.41911: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867322.42695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867322.42727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867322.42768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867322.42904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867322.42994: variable 'omit' from source: magic vars 18662 1726867322.43126: variable 'omit' from source: magic vars 18662 1726867322.43535: variable 'network_connections' from source: play vars 18662 1726867322.43539: variable 'interface' from source: set_fact 18662 1726867322.43802: variable 'interface' from source: set_fact 18662 1726867322.43985: variable 'interface' from source: set_fact 18662 1726867322.43988: variable 'interface' from source: set_fact 18662 1726867322.44107: variable 'omit' from source: magic vars 18662 1726867322.44115: variable '__lsr_ansible_managed' from source: task vars 18662 1726867322.44175: variable '__lsr_ansible_managed' from source: task vars 18662 1726867322.44665: Loaded config def from plugin (lookup/template) 18662 1726867322.44792: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18662 1726867322.44849: File lookup term: get_ansible_managed.j2 18662 1726867322.44854: variable 'ansible_search_path' from source: unknown 18662 1726867322.44857: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18662 1726867322.44973: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18662 1726867322.44976: variable 'ansible_search_path' from source: unknown 18662 1726867322.52783: variable 'ansible_managed' from source: unknown 18662 
1726867322.52924: variable 'omit' from source: magic vars 18662 1726867322.52950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867322.52978: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867322.53485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867322.53488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867322.53490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867322.53492: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867322.53495: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867322.53497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867322.53500: Set connection var ansible_timeout to 10 18662 1726867322.53503: Set connection var ansible_connection to ssh 18662 1726867322.53505: Set connection var ansible_shell_executable to /bin/sh 18662 1726867322.53511: Set connection var ansible_shell_type to sh 18662 1726867322.53514: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867322.53516: Set connection var ansible_pipelining to False 18662 1726867322.53535: variable 'ansible_shell_executable' from source: unknown 18662 1726867322.53539: variable 'ansible_connection' from source: unknown 18662 1726867322.53542: variable 'ansible_module_compression' from source: unknown 18662 1726867322.53544: variable 'ansible_shell_type' from source: unknown 18662 1726867322.53547: variable 'ansible_shell_executable' from source: unknown 18662 1726867322.53549: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867322.53551: variable 'ansible_pipelining' from source: unknown 18662 1726867322.53554: variable 'ansible_timeout' from source: unknown 18662 1726867322.53556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867322.53966: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867322.53979: variable 'omit' from source: magic vars 18662 1726867322.53982: starting attempt loop 18662 1726867322.53985: running the handler 18662 1726867322.53987: _low_level_execute_command(): starting 18662 1726867322.53989: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867322.54725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867322.54829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867322.54888: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867322.54924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867322.56856: stdout chunk (state=3): >>>/root <<< 18662 1726867322.56859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867322.56862: stderr chunk (state=3): >>><<< 18662 1726867322.56865: stdout chunk (state=3): >>><<< 18662 1726867322.56867: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867322.56869: _low_level_execute_command(): starting 18662 1726867322.56872: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589 `" && echo ansible-tmp-1726867322.5684912-19438-37286725123589="` echo /root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589 `" ) && sleep 0' 18662 1726867322.58051: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 
1726867322.58064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867322.58079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867322.58096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867322.58185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867322.60275: stdout chunk (state=3): >>>ansible-tmp-1726867322.5684912-19438-37286725123589=/root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589 <<< 18662 1726867322.60320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867322.60427: stderr chunk (state=3): >>><<< 18662 1726867322.60431: stdout chunk (state=3): >>><<< 18662 1726867322.60434: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867322.5684912-19438-37286725123589=/root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867322.60437: variable 'ansible_module_compression' from source: unknown 18662 1726867322.60486: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 18662 1726867322.60498: ANSIBALLZ: Acquiring lock 18662 1726867322.60511: ANSIBALLZ: Lock acquired: 140264017103808 18662 1726867322.60521: ANSIBALLZ: Creating module 18662 1726867322.82112: ANSIBALLZ: Writing module into payload 18662 1726867322.82887: ANSIBALLZ: Writing module 18662 1726867322.82891: ANSIBALLZ: Renaming module 18662 1726867322.82893: ANSIBALLZ: Done creating module 18662 1726867322.82895: variable 'ansible_facts' from source: unknown 18662 1726867322.83025: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589/AnsiballZ_network_connections.py 18662 1726867322.83128: Sending initial data 18662 1726867322.83131: Sent initial data (167 bytes) 18662 1726867322.83772: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867322.83829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867322.83849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867322.83883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867322.84041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867322.85646: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867322.85698: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867322.85754: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmppv6jjbg5 /root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589/AnsiballZ_network_connections.py <<< 18662 1726867322.85758: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589/AnsiballZ_network_connections.py" <<< 18662 1726867322.85821: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmppv6jjbg5" to remote "/root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589/AnsiballZ_network_connections.py" <<< 18662 1726867322.88227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867322.88230: stderr chunk (state=3): >>><<< 18662 1726867322.88233: stdout chunk (state=3): >>><<< 18662 1726867322.88235: done transferring module to remote 18662 1726867322.88237: _low_level_execute_command(): starting 18662 1726867322.88240: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589/ /root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589/AnsiballZ_network_connections.py && sleep 0' 18662 1726867322.89324: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867322.89492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867322.89500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867322.89563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867322.89572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867322.91591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867322.91595: stdout chunk (state=3): >>><<< 18662 1726867322.91601: stderr chunk (state=3): >>><<< 18662 1726867322.91618: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867322.91621: _low_level_execute_command(): starting 18662 1726867322.91630: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589/AnsiballZ_network_connections.py && sleep 0' 18662 1726867322.92783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867322.92787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867322.92790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867322.92792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867322.92794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867322.92884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867322.93181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867323.39496: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 9e3fe38f-c3cf-40e1-9296-1c3613d05895\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 9e3fe38f-c3cf-40e1-9296-1c3613d05895 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", 
"ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18662 1726867323.41699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867323.41703: stdout chunk (state=3): >>><<< 18662 1726867323.41706: stderr chunk (state=3): >>><<< 18662 1726867323.41712: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 9e3fe38f-c3cf-40e1-9296-1c3613d05895\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 9e3fe38f-c3cf-40e1-9296-1c3613d05895 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
18662 1726867323.41715: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'interface_name': 'lsr27', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'address': '192.0.2.1/24'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867323.41718: _low_level_execute_command(): starting 18662 1726867323.41720: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867322.5684912-19438-37286725123589/ > /dev/null 2>&1 && sleep 0' 18662 1726867323.43084: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867323.43250: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867323.43303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867323.43369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867323.43388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867323.43602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867323.45554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867323.45567: stdout chunk (state=3): >>><<< 18662 1726867323.45590: stderr chunk (state=3): >>><<< 18662 1726867323.45674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867323.46046: handler run complete 18662 1726867323.46050: attempt loop complete, returning result 18662 1726867323.46052: _execute() done 18662 1726867323.46055: dumping result to json 18662 1726867323.46057: done dumping result, returning 18662 1726867323.46059: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-efab-a8ce-000000000029] 18662 1726867323.46061: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000029 18662 1726867323.46140: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000029 18662 1726867323.46144: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 9e3fe38f-c3cf-40e1-9296-1c3613d05895 [004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 9e3fe38f-c3cf-40e1-9296-1c3613d05895 (not-active) 18662 1726867323.46266: no more pending results, returning what we have 18662 1726867323.46269: results queue empty 18662 1726867323.46270: checking for any_errors_fatal 18662 1726867323.46680: done checking for any_errors_fatal 18662 1726867323.46682: checking for max_fail_percentage 18662 1726867323.46684: done checking for max_fail_percentage 18662 1726867323.46685: checking to see if all hosts have failed and the running result is not ok 18662 1726867323.46686: done checking to see if all hosts have failed 18662 1726867323.46687: getting the remaining hosts for this loop 18662 1726867323.46688: done getting the remaining hosts for this loop 18662 1726867323.46692: getting the next task for host managed_node2 18662 1726867323.46699: done getting next task for host managed_node2 18662 1726867323.46703: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18662 1726867323.46705: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867323.46718: getting variables 18662 1726867323.46721: in VariableManager get_vars() 18662 1726867323.46760: Calling all_inventory to load vars for managed_node2 18662 1726867323.46762: Calling groups_inventory to load vars for managed_node2 18662 1726867323.46765: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867323.46775: Calling all_plugins_play to load vars for managed_node2 18662 1726867323.47121: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867323.47126: Calling groups_plugins_play to load vars for managed_node2 18662 1726867323.50220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867323.52593: done with get_vars() 18662 1726867323.52616: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:22:03 -0400 (0:00:01.173) 0:00:18.162 ****** 18662 1726867323.52699: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 18662 1726867323.52700: Creating lock for fedora.linux_system_roles.network_state 18662 1726867323.53322: worker is 1 (out of 1 available) 18662 1726867323.53334: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 18662 1726867323.53345: done queuing things up, now waiting for results queue to drain 18662 1726867323.53346: waiting for pending results... 18662 1726867323.53729: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 18662 1726867323.53787: in run() - task 0affcac9-a3a5-efab-a8ce-00000000002a 18662 1726867323.53805: variable 'ansible_search_path' from source: unknown 18662 1726867323.53824: variable 'ansible_search_path' from source: unknown 18662 1726867323.53866: calling self._execute() 18662 1726867323.53983: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867323.54040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867323.54067: variable 'omit' from source: magic vars 18662 1726867323.54515: variable 'ansible_distribution_major_version' from source: facts 18662 1726867323.54533: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867323.54704: variable 'network_state' from source: role '' defaults 18662 1726867323.54803: Evaluated conditional (network_state != {}): False 18662 1726867323.54807: when evaluation is False, skipping this task 18662 1726867323.54813: _execute() done 18662 1726867323.54815: dumping result to json 18662 1726867323.54818: done dumping result, returning 18662 1726867323.54820: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-efab-a8ce-00000000002a] 18662 1726867323.54822: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000002a 18662 1726867323.54929: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000002a 18662 1726867323.54933: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867323.54985: no more pending results, returning what we have 18662 1726867323.54989: results queue empty 18662 1726867323.54989: checking for any_errors_fatal 18662 1726867323.55001: done checking for 
any_errors_fatal 18662 1726867323.55002: checking for max_fail_percentage 18662 1726867323.55004: done checking for max_fail_percentage 18662 1726867323.55005: checking to see if all hosts have failed and the running result is not ok 18662 1726867323.55005: done checking to see if all hosts have failed 18662 1726867323.55006: getting the remaining hosts for this loop 18662 1726867323.55007: done getting the remaining hosts for this loop 18662 1726867323.55011: getting the next task for host managed_node2 18662 1726867323.55020: done getting next task for host managed_node2 18662 1726867323.55024: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18662 1726867323.55026: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867323.55040: getting variables 18662 1726867323.55041: in VariableManager get_vars() 18662 1726867323.55075: Calling all_inventory to load vars for managed_node2 18662 1726867323.55079: Calling groups_inventory to load vars for managed_node2 18662 1726867323.55082: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867323.55092: Calling all_plugins_play to load vars for managed_node2 18662 1726867323.55094: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867323.55097: Calling groups_plugins_play to load vars for managed_node2 18662 1726867323.57651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867323.59643: done with get_vars() 18662 1726867323.59679: done getting variables 18662 1726867323.59746: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:22:03 -0400 (0:00:00.070) 0:00:18.233 ****** 18662 1726867323.59782: entering _queue_task() for managed_node2/debug 18662 1726867323.60196: worker is 1 (out of 1 available) 18662 1726867323.60212: exiting _queue_task() for managed_node2/debug 18662 1726867323.60227: done queuing things up, now waiting for results queue to drain 18662 1726867323.60228: waiting for pending results... 
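The "Configure networking state" skip above follows directly from the role defaults: network_state resolves to an empty dict, so the guard on the state-based path never fires. A minimal sketch of a task guarded that way, assuming the module simply receives the network_state variable (the argument shape is an assumption; the task name and the when expression come from the log):

    # Sketch only: the task name and when guard mirror the log; the argument passed is assumed.
    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        state: "{{ network_state }}"
      when: network_state != {}
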
18662 1726867323.60679: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18662 1726867323.60712: in run() - task 0affcac9-a3a5-efab-a8ce-00000000002b 18662 1726867323.60737: variable 'ansible_search_path' from source: unknown 18662 1726867323.60745: variable 'ansible_search_path' from source: unknown 18662 1726867323.60883: calling self._execute() 18662 1726867323.60922: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867323.60935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867323.60951: variable 'omit' from source: magic vars 18662 1726867323.61380: variable 'ansible_distribution_major_version' from source: facts 18662 1726867323.61397: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867323.61407: variable 'omit' from source: magic vars 18662 1726867323.61462: variable 'omit' from source: magic vars 18662 1726867323.61506: variable 'omit' from source: magic vars 18662 1726867323.61563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867323.61628: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867323.61760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867323.61763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867323.61766: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867323.61768: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867323.61772: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867323.61774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867323.61906: Set connection var ansible_timeout to 10 18662 1726867323.61918: Set connection var ansible_connection to ssh 18662 1726867323.61930: Set connection var ansible_shell_executable to /bin/sh 18662 1726867323.61937: Set connection var ansible_shell_type to sh 18662 1726867323.61951: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867323.61960: Set connection var ansible_pipelining to False 18662 1726867323.61990: variable 'ansible_shell_executable' from source: unknown 18662 1726867323.62013: variable 'ansible_connection' from source: unknown 18662 1726867323.62114: variable 'ansible_module_compression' from source: unknown 18662 1726867323.62117: variable 'ansible_shell_type' from source: unknown 18662 1726867323.62119: variable 'ansible_shell_executable' from source: unknown 18662 1726867323.62121: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867323.62123: variable 'ansible_pipelining' from source: unknown 18662 1726867323.62125: variable 'ansible_timeout' from source: unknown 18662 1726867323.62127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867323.62225: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 
1726867323.62250: variable 'omit' from source: magic vars 18662 1726867323.62327: starting attempt loop 18662 1726867323.62330: running the handler 18662 1726867323.62414: variable '__network_connections_result' from source: set_fact 18662 1726867323.62490: handler run complete 18662 1726867323.62516: attempt loop complete, returning result 18662 1726867323.62524: _execute() done 18662 1726867323.62532: dumping result to json 18662 1726867323.62542: done dumping result, returning 18662 1726867323.62559: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-efab-a8ce-00000000002b] 18662 1726867323.62656: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000002b 18662 1726867323.62732: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000002b 18662 1726867323.62735: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 9e3fe38f-c3cf-40e1-9296-1c3613d05895", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 9e3fe38f-c3cf-40e1-9296-1c3613d05895 (not-active)" ] } 18662 1726867323.62807: no more pending results, returning what we have 18662 1726867323.62814: results queue empty 18662 1726867323.62814: checking for any_errors_fatal 18662 1726867323.62820: done checking for any_errors_fatal 18662 1726867323.62821: checking for max_fail_percentage 18662 1726867323.62823: done checking for max_fail_percentage 18662 1726867323.62824: checking to see if all hosts have failed and the running result is not ok 18662 1726867323.62825: done checking to see if all hosts have failed 18662 1726867323.62825: getting the remaining hosts for this loop 18662 1726867323.62827: done getting the remaining hosts for this loop 18662 1726867323.62831: getting the next task for host managed_node2 18662 1726867323.62838: done getting next task for host managed_node2 18662 1726867323.62842: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18662 1726867323.63080: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867323.63092: getting variables 18662 1726867323.63093: in VariableManager get_vars() 18662 1726867323.63130: Calling all_inventory to load vars for managed_node2 18662 1726867323.63132: Calling groups_inventory to load vars for managed_node2 18662 1726867323.63135: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867323.63144: Calling all_plugins_play to load vars for managed_node2 18662 1726867323.63147: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867323.63150: Calling groups_plugins_play to load vars for managed_node2 18662 1726867323.65239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867323.67285: done with get_vars() 18662 1726867323.67313: done getting variables 18662 1726867323.67387: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:22:03 -0400 (0:00:00.076) 0:00:18.309 ****** 18662 1726867323.67422: entering _queue_task() for managed_node2/debug 18662 1726867323.67815: worker is 1 (out of 1 available) 18662 1726867323.67830: exiting _queue_task() for managed_node2/debug 18662 1726867323.67843: done queuing things up, now waiting for results queue to drain 18662 1726867323.67845: waiting for pending results... 18662 1726867323.68327: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18662 1726867323.68383: in run() - task 0affcac9-a3a5-efab-a8ce-00000000002c 18662 1726867323.68387: variable 'ansible_search_path' from source: unknown 18662 1726867323.68390: variable 'ansible_search_path' from source: unknown 18662 1726867323.68429: calling self._execute() 18662 1726867323.68537: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867323.68582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867323.68586: variable 'omit' from source: magic vars 18662 1726867323.69272: variable 'ansible_distribution_major_version' from source: facts 18662 1726867323.69396: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867323.69399: variable 'omit' from source: magic vars 18662 1726867323.69402: variable 'omit' from source: magic vars 18662 1726867323.69413: variable 'omit' from source: magic vars 18662 1726867323.69458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867323.69515: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867323.69584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867323.69588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867323.69591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867323.69684: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867323.69687: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867323.69690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867323.69758: Set connection var ansible_timeout to 10 18662 1726867323.69767: Set connection var ansible_connection to ssh 18662 1726867323.69780: Set connection var ansible_shell_executable to /bin/sh 18662 1726867323.69788: Set connection var ansible_shell_type to sh 18662 1726867323.69803: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867323.69828: Set connection var ansible_pipelining to False 18662 1726867323.69858: variable 'ansible_shell_executable' from source: unknown 18662 1726867323.69926: variable 'ansible_connection' from source: unknown 18662 1726867323.69935: variable 'ansible_module_compression' from source: unknown 18662 1726867323.69938: variable 'ansible_shell_type' from source: unknown 18662 1726867323.69941: variable 'ansible_shell_executable' from source: unknown 18662 1726867323.69943: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867323.69945: variable 'ansible_pipelining' from source: unknown 18662 1726867323.69946: variable 'ansible_timeout' from source: unknown 18662 1726867323.69948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867323.70069: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867323.70091: variable 'omit' from source: magic vars 18662 1726867323.70103: starting attempt loop 18662 1726867323.70114: running the handler 18662 1726867323.70179: variable '__network_connections_result' from source: set_fact 18662 1726867323.70274: variable '__network_connections_result' from source: set_fact 18662 1726867323.70471: handler run complete 18662 1726867323.70475: attempt loop complete, returning result 18662 1726867323.70479: _execute() done 18662 1726867323.70481: dumping result to json 18662 1726867323.70483: done dumping result, returning 18662 1726867323.70486: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-efab-a8ce-00000000002c] 18662 1726867323.70489: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000002c 18662 1726867323.70653: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000002c 18662 1726867323.70656: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 9e3fe38f-c3cf-40e1-9296-1c3613d05895\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 9e3fe38f-c3cf-40e1-9296-1c3613d05895 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection 
lsr27, 9e3fe38f-c3cf-40e1-9296-1c3613d05895", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 9e3fe38f-c3cf-40e1-9296-1c3613d05895 (not-active)" ] } } 18662 1726867323.70764: no more pending results, returning what we have 18662 1726867323.70767: results queue empty 18662 1726867323.70768: checking for any_errors_fatal 18662 1726867323.70775: done checking for any_errors_fatal 18662 1726867323.70776: checking for max_fail_percentage 18662 1726867323.70897: done checking for max_fail_percentage 18662 1726867323.70898: checking to see if all hosts have failed and the running result is not ok 18662 1726867323.70899: done checking to see if all hosts have failed 18662 1726867323.70899: getting the remaining hosts for this loop 18662 1726867323.70901: done getting the remaining hosts for this loop 18662 1726867323.70905: getting the next task for host managed_node2 18662 1726867323.70913: done getting next task for host managed_node2 18662 1726867323.70916: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18662 1726867323.70919: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867323.70929: getting variables 18662 1726867323.70931: in VariableManager get_vars() 18662 1726867323.70964: Calling all_inventory to load vars for managed_node2 18662 1726867323.70966: Calling groups_inventory to load vars for managed_node2 18662 1726867323.70969: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867323.71064: Calling all_plugins_play to load vars for managed_node2 18662 1726867323.71069: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867323.71082: Calling groups_plugins_play to load vars for managed_node2 18662 1726867323.72695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867323.75336: done with get_vars() 18662 1726867323.75359: done getting variables 18662 1726867323.75432: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:22:03 -0400 (0:00:00.080) 0:00:18.390 ****** 18662 1726867323.75464: entering _queue_task() for managed_node2/debug 18662 1726867323.75824: worker is 1 (out of 1 available) 18662 1726867323.75837: exiting _queue_task() for managed_node2/debug 18662 1726867323.75849: done queuing things up, now waiting for results queue to drain 18662 1726867323.75850: waiting for pending results... 
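The __network_connections_result dump a few entries above is produced by a plain debug task that prints the result registered from the connection run. Based on the task name, the debug action plugin being loaded, and the variable shown, the task at roles/network/tasks/main.yml:181 is most likely equivalent to this sketch:

    # Reconstructed from the log: debug action printing __network_connections_result.
    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result
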
18662 1726867323.76244: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18662 1726867323.76249: in run() - task 0affcac9-a3a5-efab-a8ce-00000000002d 18662 1726867323.76251: variable 'ansible_search_path' from source: unknown 18662 1726867323.76261: variable 'ansible_search_path' from source: unknown 18662 1726867323.76300: calling self._execute() 18662 1726867323.76401: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867323.76415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867323.76427: variable 'omit' from source: magic vars 18662 1726867323.76823: variable 'ansible_distribution_major_version' from source: facts 18662 1726867323.76839: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867323.76971: variable 'network_state' from source: role '' defaults 18662 1726867323.77020: Evaluated conditional (network_state != {}): False 18662 1726867323.77023: when evaluation is False, skipping this task 18662 1726867323.77026: _execute() done 18662 1726867323.77028: dumping result to json 18662 1726867323.77030: done dumping result, returning 18662 1726867323.77037: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-efab-a8ce-00000000002d] 18662 1726867323.77047: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000002d 18662 1726867323.77286: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000002d 18662 1726867323.77289: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 18662 1726867323.77343: no more pending results, returning what we have 18662 1726867323.77348: results queue empty 18662 1726867323.77349: checking for any_errors_fatal 18662 1726867323.77355: done checking for any_errors_fatal 18662 1726867323.77355: checking for max_fail_percentage 18662 1726867323.77357: done checking for max_fail_percentage 18662 1726867323.77358: checking to see if all hosts have failed and the running result is not ok 18662 1726867323.77359: done checking to see if all hosts have failed 18662 1726867323.77360: getting the remaining hosts for this loop 18662 1726867323.77361: done getting the remaining hosts for this loop 18662 1726867323.77364: getting the next task for host managed_node2 18662 1726867323.77372: done getting next task for host managed_node2 18662 1726867323.77376: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18662 1726867323.77381: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867323.77394: getting variables 18662 1726867323.77396: in VariableManager get_vars() 18662 1726867323.77439: Calling all_inventory to load vars for managed_node2 18662 1726867323.77442: Calling groups_inventory to load vars for managed_node2 18662 1726867323.77445: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867323.77458: Calling all_plugins_play to load vars for managed_node2 18662 1726867323.77461: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867323.77464: Calling groups_plugins_play to load vars for managed_node2 18662 1726867323.79030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867323.80758: done with get_vars() 18662 1726867323.80783: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:22:03 -0400 (0:00:00.054) 0:00:18.444 ****** 18662 1726867323.80893: entering _queue_task() for managed_node2/ping 18662 1726867323.80895: Creating lock for ping 18662 1726867323.81211: worker is 1 (out of 1 available) 18662 1726867323.81223: exiting _queue_task() for managed_node2/ping 18662 1726867323.81238: done queuing things up, now waiting for results queue to drain 18662 1726867323.81240: waiting for pending results... 18662 1726867323.81467: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 18662 1726867323.81563: in run() - task 0affcac9-a3a5-efab-a8ce-00000000002e 18662 1726867323.81567: variable 'ansible_search_path' from source: unknown 18662 1726867323.81570: variable 'ansible_search_path' from source: unknown 18662 1726867323.81573: calling self._execute() 18662 1726867323.81631: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867323.81637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867323.81646: variable 'omit' from source: magic vars 18662 1726867323.82016: variable 'ansible_distribution_major_version' from source: facts 18662 1726867323.82032: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867323.82082: variable 'omit' from source: magic vars 18662 1726867323.82086: variable 'omit' from source: magic vars 18662 1726867323.82116: variable 'omit' from source: magic vars 18662 1726867323.82159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867323.82212: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867323.82216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867323.82252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867323.82255: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867323.82274: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867323.82278: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867323.82281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867323.82425: Set connection var ansible_timeout to 10 18662 
1726867323.82428: Set connection var ansible_connection to ssh 18662 1726867323.82434: Set connection var ansible_shell_executable to /bin/sh 18662 1726867323.82436: Set connection var ansible_shell_type to sh 18662 1726867323.82438: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867323.82441: Set connection var ansible_pipelining to False 18662 1726867323.82443: variable 'ansible_shell_executable' from source: unknown 18662 1726867323.82445: variable 'ansible_connection' from source: unknown 18662 1726867323.82447: variable 'ansible_module_compression' from source: unknown 18662 1726867323.82449: variable 'ansible_shell_type' from source: unknown 18662 1726867323.82451: variable 'ansible_shell_executable' from source: unknown 18662 1726867323.82454: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867323.82456: variable 'ansible_pipelining' from source: unknown 18662 1726867323.82458: variable 'ansible_timeout' from source: unknown 18662 1726867323.82460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867323.82752: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867323.82757: variable 'omit' from source: magic vars 18662 1726867323.82759: starting attempt loop 18662 1726867323.82761: running the handler 18662 1726867323.82763: _low_level_execute_command(): starting 18662 1726867323.82765: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867323.83602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867323.83693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867323.83727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867323.83730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867323.83769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867323.83828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867323.85541: stdout chunk (state=3): >>>/root <<< 18662 1726867323.85685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867323.85688: stdout chunk (state=3): >>><<< 18662 1726867323.85690: stderr chunk (state=3): >>><<< 18662 1726867323.85706: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867323.85723: _low_level_execute_command(): starting 18662 1726867323.85795: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972 `" && echo ansible-tmp-1726867323.8571174-19513-12417689666972="` echo /root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972 `" ) && sleep 0' 18662 1726867323.86329: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867323.86343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867323.86361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867323.86450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867323.86497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867323.86514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867323.86544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867323.86622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867323.88640: stdout chunk (state=3): >>>ansible-tmp-1726867323.8571174-19513-12417689666972=/root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972 <<< 18662 1726867323.88796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867323.88800: stdout chunk (state=3): >>><<< 18662 1726867323.88802: stderr chunk (state=3): >>><<< 18662 1726867323.88822: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867323.8571174-19513-12417689666972=/root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867323.88883: variable 'ansible_module_compression' from source: unknown 18662 1726867323.88931: ANSIBALLZ: Using lock for ping 18662 1726867323.88982: ANSIBALLZ: Acquiring lock 18662 1726867323.88985: ANSIBALLZ: Lock acquired: 140264021008384 18662 1726867323.88988: ANSIBALLZ: Creating module 18662 1726867324.01980: ANSIBALLZ: Writing module into payload 18662 1726867324.02046: ANSIBALLZ: Writing module 18662 1726867324.02084: ANSIBALLZ: Renaming module 18662 1726867324.02096: ANSIBALLZ: Done creating module 18662 1726867324.02116: variable 'ansible_facts' from source: unknown 18662 1726867324.02288: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972/AnsiballZ_ping.py 18662 1726867324.02420: Sending initial data 18662 1726867324.02424: Sent initial data (152 bytes) 18662 1726867324.02999: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867324.03015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867324.03058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18662 1726867324.03071: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 18662 1726867324.03083: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18662 1726867324.03172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867324.03190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867324.03205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867324.03226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867324.03408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867324.05116: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867324.05151: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867324.05237: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpcsshivcz /root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972/AnsiballZ_ping.py <<< 18662 1726867324.05241: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972/AnsiballZ_ping.py" <<< 18662 1726867324.05447: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpcsshivcz" to remote "/root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972/AnsiballZ_ping.py" <<< 18662 1726867324.06517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867324.06521: stdout chunk (state=3): >>><<< 18662 1726867324.06523: stderr chunk (state=3): >>><<< 18662 1726867324.06526: done transferring module to remote 18662 1726867324.06687: _low_level_execute_command(): starting 18662 1726867324.06691: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972/ /root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972/AnsiballZ_ping.py && sleep 0' 18662 1726867324.07690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867324.07722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867324.07802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867324.07822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867324.07924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867324.07949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867324.08018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867324.09918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867324.09922: stdout chunk (state=3): >>><<< 18662 1726867324.09924: stderr chunk (state=3): >>><<< 18662 1726867324.09940: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867324.10019: _low_level_execute_command(): starting 18662 1726867324.10022: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972/AnsiballZ_ping.py && sleep 0' 18662 1726867324.10529: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867324.10545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867324.10560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867324.10586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867324.10683: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867324.10704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867324.10719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867324.10800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867324.26175: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18662 1726867324.27558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867324.27562: stdout chunk (state=3): >>><<< 18662 1726867324.27568: stderr chunk (state=3): >>><<< 18662 1726867324.27603: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
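
For reference, the exchange above is the standard ping round-trip: Ansible creates a remote temp directory, uploads AnsiballZ_ping.py over the multiplexed SSH connection, executes it with /usr/bin/python3.12, and reads back the JSON result {"ping": "pong"}. A minimal task that triggers this exchange would look roughly as follows (a sketch with module defaults assumed; the real task belongs to the fedora.linux_system_roles.network role and its source is not reproduced in this log):

    # Sketch only: module arguments left at their defaults ("data" defaults to "pong").
    - name: Re-test connectivity
      ansible.builtin.ping:

The module's default success payload is exactly the stdout captured above: {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}.
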
18662 1726867324.27784: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867324.27788: _low_level_execute_command(): starting 18662 1726867324.27791: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867323.8571174-19513-12417689666972/ > /dev/null 2>&1 && sleep 0' 18662 1726867324.28307: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867324.28317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867324.28327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867324.28353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867324.28364: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867324.28370: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867324.28459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867324.28475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867324.28493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867324.28509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867324.28580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867324.30472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867324.30522: stderr chunk (state=3): >>><<< 18662 1726867324.30540: stdout chunk (state=3): >>><<< 18662 1726867324.30785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867324.30789: handler run complete 18662 1726867324.30792: attempt loop complete, returning result 18662 1726867324.30794: _execute() done 18662 1726867324.30797: dumping result to json 18662 1726867324.30799: done dumping result, returning 18662 1726867324.30801: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-efab-a8ce-00000000002e] 18662 1726867324.30803: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000002e 18662 1726867324.30872: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000002e 18662 1726867324.30875: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 18662 1726867324.30940: no more pending results, returning what we have 18662 1726867324.30944: results queue empty 18662 1726867324.30945: checking for any_errors_fatal 18662 1726867324.30952: done checking for any_errors_fatal 18662 1726867324.30953: checking for max_fail_percentage 18662 1726867324.30955: done checking for max_fail_percentage 18662 1726867324.30956: checking to see if all hosts have failed and the running result is not ok 18662 1726867324.30956: done checking to see if all hosts have failed 18662 1726867324.30957: getting the remaining hosts for this loop 18662 1726867324.30959: done getting the remaining hosts for this loop 18662 1726867324.30962: getting the next task for host managed_node2 18662 1726867324.30972: done getting next task for host managed_node2 18662 1726867324.30974: ^ task is: TASK: meta (role_complete) 18662 1726867324.30976: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867324.30995: getting variables 18662 1726867324.30997: in VariableManager get_vars() 18662 1726867324.31038: Calling all_inventory to load vars for managed_node2 18662 1726867324.31041: Calling groups_inventory to load vars for managed_node2 18662 1726867324.31044: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867324.31055: Calling all_plugins_play to load vars for managed_node2 18662 1726867324.31058: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867324.31061: Calling groups_plugins_play to load vars for managed_node2 18662 1726867324.33042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867324.34924: done with get_vars() 18662 1726867324.34948: done getting variables 18662 1726867324.35247: done queuing things up, now waiting for results queue to drain 18662 1726867324.35249: results queue empty 18662 1726867324.35250: checking for any_errors_fatal 18662 1726867324.35252: done checking for any_errors_fatal 18662 1726867324.35254: checking for max_fail_percentage 18662 1726867324.35255: done checking for max_fail_percentage 18662 1726867324.35256: checking to see if all hosts have failed and the running result is not ok 18662 1726867324.35256: done checking to see if all hosts have failed 18662 1726867324.35257: getting the remaining hosts for this loop 18662 1726867324.35258: done getting the remaining hosts for this loop 18662 1726867324.35260: getting the next task for host managed_node2 18662 1726867324.35264: done getting next task for host managed_node2 18662 1726867324.35266: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18662 1726867324.35268: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867324.35270: getting variables 18662 1726867324.35271: in VariableManager get_vars() 18662 1726867324.35284: Calling all_inventory to load vars for managed_node2 18662 1726867324.35286: Calling groups_inventory to load vars for managed_node2 18662 1726867324.35287: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867324.35292: Calling all_plugins_play to load vars for managed_node2 18662 1726867324.35294: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867324.35296: Calling groups_plugins_play to load vars for managed_node2 18662 1726867324.36941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867324.38917: done with get_vars() 18662 1726867324.38937: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Friday 20 September 2024 17:22:04 -0400 (0:00:00.581) 0:00:19.025 ****** 18662 1726867324.39006: entering _queue_task() for managed_node2/include_tasks 18662 1726867324.39718: worker is 1 (out of 1 available) 18662 1726867324.39729: exiting _queue_task() for managed_node2/include_tasks 18662 1726867324.39740: done queuing things up, now waiting for results queue to drain 18662 1726867324.39742: waiting for pending results... 
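
The task announced here, taken from tests_ethernet.yml:47, is queued as an include_tasks action ("entering _queue_task() for managed_node2/include_tasks"). Its body is not printed in the log, only its name and the file it ends up loading, so the following is a plausible sketch rather than the actual source (the relative path in particular is assumed):

    # Sketch: only the resolved absolute path of the included file appears in the log;
    # the relative path used here is assumed.
    - name: Include the task 'assert_output_in_stderr_without_warnings.yml'
      ansible.builtin.include_tasks: tasks/assert_output_in_stderr_without_warnings.yml
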
18662 1726867324.40441: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 18662 1726867324.40447: in run() - task 0affcac9-a3a5-efab-a8ce-000000000030 18662 1726867324.40450: variable 'ansible_search_path' from source: unknown 18662 1726867324.40455: calling self._execute() 18662 1726867324.40640: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867324.40868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867324.40872: variable 'omit' from source: magic vars 18662 1726867324.41484: variable 'ansible_distribution_major_version' from source: facts 18662 1726867324.41502: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867324.41520: _execute() done 18662 1726867324.41528: dumping result to json 18662 1726867324.41591: done dumping result, returning 18662 1726867324.41602: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [0affcac9-a3a5-efab-a8ce-000000000030] 18662 1726867324.41614: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000030 18662 1726867324.41760: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000030 18662 1726867324.41764: WORKER PROCESS EXITING 18662 1726867324.41794: no more pending results, returning what we have 18662 1726867324.41799: in VariableManager get_vars() 18662 1726867324.41842: Calling all_inventory to load vars for managed_node2 18662 1726867324.41845: Calling groups_inventory to load vars for managed_node2 18662 1726867324.41847: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867324.41861: Calling all_plugins_play to load vars for managed_node2 18662 1726867324.41864: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867324.41867: Calling groups_plugins_play to load vars for managed_node2 18662 1726867324.44459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867324.46522: done with get_vars() 18662 1726867324.46544: variable 'ansible_search_path' from source: unknown 18662 1726867324.46561: we have included files to process 18662 1726867324.46562: generating all_blocks data 18662 1726867324.46564: done generating all_blocks data 18662 1726867324.46569: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18662 1726867324.46570: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18662 1726867324.46572: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 18662 1726867324.47322: done processing included file 18662 1726867324.47324: iterating over new_blocks loaded from include file 18662 1726867324.47325: in VariableManager get_vars() 18662 1726867324.47341: done with get_vars() 18662 1726867324.47343: filtering new block on tags 18662 1726867324.47360: done filtering new block on tags 18662 1726867324.47362: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml for managed_node2 18662 1726867324.47367: extending 
task lists for all hosts with included blocks 18662 1726867324.47399: done extending task lists 18662 1726867324.47400: done processing included files 18662 1726867324.47401: results queue empty 18662 1726867324.47401: checking for any_errors_fatal 18662 1726867324.47403: done checking for any_errors_fatal 18662 1726867324.47404: checking for max_fail_percentage 18662 1726867324.47405: done checking for max_fail_percentage 18662 1726867324.47406: checking to see if all hosts have failed and the running result is not ok 18662 1726867324.47406: done checking to see if all hosts have failed 18662 1726867324.47407: getting the remaining hosts for this loop 18662 1726867324.47408: done getting the remaining hosts for this loop 18662 1726867324.47410: getting the next task for host managed_node2 18662 1726867324.47414: done getting next task for host managed_node2 18662 1726867324.47416: ^ task is: TASK: Assert that warnings is empty 18662 1726867324.47418: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867324.47420: getting variables 18662 1726867324.47421: in VariableManager get_vars() 18662 1726867324.47434: Calling all_inventory to load vars for managed_node2 18662 1726867324.47436: Calling groups_inventory to load vars for managed_node2 18662 1726867324.47438: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867324.47444: Calling all_plugins_play to load vars for managed_node2 18662 1726867324.47446: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867324.47449: Calling groups_plugins_play to load vars for managed_node2 18662 1726867324.49911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867324.53085: done with get_vars() 18662 1726867324.53194: done getting variables 18662 1726867324.53415: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that warnings is empty] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:3 Friday 20 September 2024 17:22:04 -0400 (0:00:00.144) 0:00:19.169 ****** 18662 1726867324.53446: entering _queue_task() for managed_node2/assert 18662 1726867324.54138: worker is 1 (out of 1 available) 18662 1726867324.54151: exiting _queue_task() for managed_node2/assert 18662 1726867324.54162: done queuing things up, now waiting for results queue to drain 18662 1726867324.54164: waiting for pending results... 
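
The first task loaded from that file, "Assert that warnings is empty" (assert_output_in_stderr_without_warnings.yml:3), runs the assert action plugin against the registered __network_connections_result fact. Reconstructed from the "Evaluated conditional" entry a few lines below, the task is roughly as follows (quoting style and any fail_msg are assumed):

    # Condition copied verbatim from the "Evaluated conditional" log entry;
    # the rest of the task layout is assumed.
    - name: Assert that warnings is empty
      ansible.builtin.assert:
        that:
          - "'warnings' not in __network_connections_result"
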
18662 1726867324.54591: running TaskExecutor() for managed_node2/TASK: Assert that warnings is empty 18662 1726867324.54767: in run() - task 0affcac9-a3a5-efab-a8ce-000000000304 18662 1726867324.54782: variable 'ansible_search_path' from source: unknown 18662 1726867324.54790: variable 'ansible_search_path' from source: unknown 18662 1726867324.54818: calling self._execute() 18662 1726867324.55018: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867324.55024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867324.55035: variable 'omit' from source: magic vars 18662 1726867324.55828: variable 'ansible_distribution_major_version' from source: facts 18662 1726867324.55839: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867324.55845: variable 'omit' from source: magic vars 18662 1726867324.55994: variable 'omit' from source: magic vars 18662 1726867324.56103: variable 'omit' from source: magic vars 18662 1726867324.56107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867324.56113: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867324.56115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867324.56132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867324.56258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867324.56365: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867324.56368: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867324.56371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867324.56582: Set connection var ansible_timeout to 10 18662 1726867324.56585: Set connection var ansible_connection to ssh 18662 1726867324.56591: Set connection var ansible_shell_executable to /bin/sh 18662 1726867324.56593: Set connection var ansible_shell_type to sh 18662 1726867324.56604: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867324.56611: Set connection var ansible_pipelining to False 18662 1726867324.56633: variable 'ansible_shell_executable' from source: unknown 18662 1726867324.56636: variable 'ansible_connection' from source: unknown 18662 1726867324.56754: variable 'ansible_module_compression' from source: unknown 18662 1726867324.56759: variable 'ansible_shell_type' from source: unknown 18662 1726867324.56761: variable 'ansible_shell_executable' from source: unknown 18662 1726867324.56763: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867324.56765: variable 'ansible_pipelining' from source: unknown 18662 1726867324.56766: variable 'ansible_timeout' from source: unknown 18662 1726867324.56768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867324.56896: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867324.56994: variable 'omit' from source: magic vars 18662 
1726867324.57084: starting attempt loop 18662 1726867324.57087: running the handler 18662 1726867324.57417: variable '__network_connections_result' from source: set_fact 18662 1726867324.57431: Evaluated conditional ('warnings' not in __network_connections_result): True 18662 1726867324.57441: handler run complete 18662 1726867324.57457: attempt loop complete, returning result 18662 1726867324.57460: _execute() done 18662 1726867324.57462: dumping result to json 18662 1726867324.57465: done dumping result, returning 18662 1726867324.57472: done running TaskExecutor() for managed_node2/TASK: Assert that warnings is empty [0affcac9-a3a5-efab-a8ce-000000000304] 18662 1726867324.57479: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000304 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 18662 1726867324.57681: no more pending results, returning what we have 18662 1726867324.57685: results queue empty 18662 1726867324.57686: checking for any_errors_fatal 18662 1726867324.57688: done checking for any_errors_fatal 18662 1726867324.57688: checking for max_fail_percentage 18662 1726867324.57690: done checking for max_fail_percentage 18662 1726867324.57691: checking to see if all hosts have failed and the running result is not ok 18662 1726867324.57691: done checking to see if all hosts have failed 18662 1726867324.57692: getting the remaining hosts for this loop 18662 1726867324.57693: done getting the remaining hosts for this loop 18662 1726867324.57697: getting the next task for host managed_node2 18662 1726867324.57705: done getting next task for host managed_node2 18662 1726867324.57708: ^ task is: TASK: Assert that there is output in stderr 18662 1726867324.57713: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867324.57716: getting variables 18662 1726867324.57718: in VariableManager get_vars() 18662 1726867324.57912: Calling all_inventory to load vars for managed_node2 18662 1726867324.57914: Calling groups_inventory to load vars for managed_node2 18662 1726867324.57917: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867324.57930: Calling all_plugins_play to load vars for managed_node2 18662 1726867324.57934: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867324.57937: Calling groups_plugins_play to load vars for managed_node2 18662 1726867324.58766: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000304 18662 1726867324.58770: WORKER PROCESS EXITING 18662 1726867324.70556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867324.72786: done with get_vars() 18662 1726867324.72811: done getting variables 18662 1726867324.72879: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that there is output in stderr] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:8 Friday 20 September 2024 17:22:04 -0400 (0:00:00.194) 0:00:19.364 ****** 18662 1726867324.72907: entering _queue_task() for managed_node2/assert 18662 1726867324.73258: worker is 1 (out of 1 available) 18662 1726867324.73270: exiting _queue_task() for managed_node2/assert 18662 1726867324.73486: done queuing things up, now waiting for results queue to drain 18662 1726867324.73488: waiting for pending results... 
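
Its companion task, "Assert that there is output in stderr" (assert_output_in_stderr_without_warnings.yml:8), checks the same registered result; the condition evaluated below is 'stderr' in __network_connections_result, so a corresponding sketch is:

    # Condition copied from the "Evaluated conditional" log entry below;
    # the rest of the task layout is assumed.
    - name: Assert that there is output in stderr
      ansible.builtin.assert:
        that:
          - "'stderr' in __network_connections_result"
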
18662 1726867324.73558: running TaskExecutor() for managed_node2/TASK: Assert that there is output in stderr 18662 1726867324.73673: in run() - task 0affcac9-a3a5-efab-a8ce-000000000305 18662 1726867324.73686: variable 'ansible_search_path' from source: unknown 18662 1726867324.73689: variable 'ansible_search_path' from source: unknown 18662 1726867324.73734: calling self._execute() 18662 1726867324.73833: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867324.73840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867324.73850: variable 'omit' from source: magic vars 18662 1726867324.74239: variable 'ansible_distribution_major_version' from source: facts 18662 1726867324.74259: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867324.74273: variable 'omit' from source: magic vars 18662 1726867324.74311: variable 'omit' from source: magic vars 18662 1726867324.74347: variable 'omit' from source: magic vars 18662 1726867324.74396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867324.74430: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867324.74447: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867324.74471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867324.74483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867324.74514: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867324.74518: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867324.74522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867324.74883: Set connection var ansible_timeout to 10 18662 1726867324.74886: Set connection var ansible_connection to ssh 18662 1726867324.74888: Set connection var ansible_shell_executable to /bin/sh 18662 1726867324.74891: Set connection var ansible_shell_type to sh 18662 1726867324.74893: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867324.74898: Set connection var ansible_pipelining to False 18662 1726867324.74901: variable 'ansible_shell_executable' from source: unknown 18662 1726867324.74903: variable 'ansible_connection' from source: unknown 18662 1726867324.74906: variable 'ansible_module_compression' from source: unknown 18662 1726867324.74911: variable 'ansible_shell_type' from source: unknown 18662 1726867324.74914: variable 'ansible_shell_executable' from source: unknown 18662 1726867324.74916: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867324.74918: variable 'ansible_pipelining' from source: unknown 18662 1726867324.74920: variable 'ansible_timeout' from source: unknown 18662 1726867324.74922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867324.74925: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867324.74928: variable 'omit' from source: magic vars 
18662 1726867324.74930: starting attempt loop 18662 1726867324.74932: running the handler 18662 1726867324.74934: variable '__network_connections_result' from source: set_fact 18662 1726867324.74949: Evaluated conditional ('stderr' in __network_connections_result): True 18662 1726867324.74952: handler run complete 18662 1726867324.74965: attempt loop complete, returning result 18662 1726867324.74968: _execute() done 18662 1726867324.74970: dumping result to json 18662 1726867324.74973: done dumping result, returning 18662 1726867324.74981: done running TaskExecutor() for managed_node2/TASK: Assert that there is output in stderr [0affcac9-a3a5-efab-a8ce-000000000305] 18662 1726867324.74985: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000305 18662 1726867324.75073: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000305 18662 1726867324.75076: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 18662 1726867324.75154: no more pending results, returning what we have 18662 1726867324.75157: results queue empty 18662 1726867324.75158: checking for any_errors_fatal 18662 1726867324.75167: done checking for any_errors_fatal 18662 1726867324.75167: checking for max_fail_percentage 18662 1726867324.75169: done checking for max_fail_percentage 18662 1726867324.75170: checking to see if all hosts have failed and the running result is not ok 18662 1726867324.75171: done checking to see if all hosts have failed 18662 1726867324.75171: getting the remaining hosts for this loop 18662 1726867324.75173: done getting the remaining hosts for this loop 18662 1726867324.75176: getting the next task for host managed_node2 18662 1726867324.75189: done getting next task for host managed_node2 18662 1726867324.75191: ^ task is: TASK: meta (flush_handlers) 18662 1726867324.75193: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867324.75197: getting variables 18662 1726867324.75199: in VariableManager get_vars() 18662 1726867324.75401: Calling all_inventory to load vars for managed_node2 18662 1726867324.75404: Calling groups_inventory to load vars for managed_node2 18662 1726867324.75406: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867324.75415: Calling all_plugins_play to load vars for managed_node2 18662 1726867324.75418: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867324.75421: Calling groups_plugins_play to load vars for managed_node2 18662 1726867324.76701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867324.78820: done with get_vars() 18662 1726867324.78842: done getting variables 18662 1726867324.78920: in VariableManager get_vars() 18662 1726867324.78933: Calling all_inventory to load vars for managed_node2 18662 1726867324.78935: Calling groups_inventory to load vars for managed_node2 18662 1726867324.78937: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867324.78942: Calling all_plugins_play to load vars for managed_node2 18662 1726867324.78945: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867324.78947: Calling groups_plugins_play to load vars for managed_node2 18662 1726867324.80156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867324.81805: done with get_vars() 18662 1726867324.81833: done queuing things up, now waiting for results queue to drain 18662 1726867324.81839: results queue empty 18662 1726867324.81841: checking for any_errors_fatal 18662 1726867324.81843: done checking for any_errors_fatal 18662 1726867324.81844: checking for max_fail_percentage 18662 1726867324.81845: done checking for max_fail_percentage 18662 1726867324.81846: checking to see if all hosts have failed and the running result is not ok 18662 1726867324.81846: done checking to see if all hosts have failed 18662 1726867324.81847: getting the remaining hosts for this loop 18662 1726867324.81853: done getting the remaining hosts for this loop 18662 1726867324.81856: getting the next task for host managed_node2 18662 1726867324.81860: done getting next task for host managed_node2 18662 1726867324.81862: ^ task is: TASK: meta (flush_handlers) 18662 1726867324.81863: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867324.81866: getting variables 18662 1726867324.81867: in VariableManager get_vars() 18662 1726867324.81880: Calling all_inventory to load vars for managed_node2 18662 1726867324.81882: Calling groups_inventory to load vars for managed_node2 18662 1726867324.81884: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867324.81889: Calling all_plugins_play to load vars for managed_node2 18662 1726867324.81891: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867324.81894: Calling groups_plugins_play to load vars for managed_node2 18662 1726867324.83166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867324.86044: done with get_vars() 18662 1726867324.86069: done getting variables 18662 1726867324.86125: in VariableManager get_vars() 18662 1726867324.86138: Calling all_inventory to load vars for managed_node2 18662 1726867324.86254: Calling groups_inventory to load vars for managed_node2 18662 1726867324.86258: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867324.86263: Calling all_plugins_play to load vars for managed_node2 18662 1726867324.86266: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867324.86269: Calling groups_plugins_play to load vars for managed_node2 18662 1726867324.88159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867324.89822: done with get_vars() 18662 1726867324.89854: done queuing things up, now waiting for results queue to drain 18662 1726867324.89856: results queue empty 18662 1726867324.89857: checking for any_errors_fatal 18662 1726867324.89858: done checking for any_errors_fatal 18662 1726867324.89859: checking for max_fail_percentage 18662 1726867324.89860: done checking for max_fail_percentage 18662 1726867324.89860: checking to see if all hosts have failed and the running result is not ok 18662 1726867324.89861: done checking to see if all hosts have failed 18662 1726867324.89862: getting the remaining hosts for this loop 18662 1726867324.89863: done getting the remaining hosts for this loop 18662 1726867324.89866: getting the next task for host managed_node2 18662 1726867324.89869: done getting next task for host managed_node2 18662 1726867324.89870: ^ task is: None 18662 1726867324.89871: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867324.89872: done queuing things up, now waiting for results queue to drain 18662 1726867324.89873: results queue empty 18662 1726867324.89874: checking for any_errors_fatal 18662 1726867324.89875: done checking for any_errors_fatal 18662 1726867324.89875: checking for max_fail_percentage 18662 1726867324.89876: done checking for max_fail_percentage 18662 1726867324.89879: checking to see if all hosts have failed and the running result is not ok 18662 1726867324.89880: done checking to see if all hosts have failed 18662 1726867324.89881: getting the next task for host managed_node2 18662 1726867324.89883: done getting next task for host managed_node2 18662 1726867324.89884: ^ task is: None 18662 1726867324.89885: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867324.89925: in VariableManager get_vars() 18662 1726867324.89954: done with get_vars() 18662 1726867324.89962: in VariableManager get_vars() 18662 1726867324.89972: done with get_vars() 18662 1726867324.89976: variable 'omit' from source: magic vars 18662 1726867324.90011: in VariableManager get_vars() 18662 1726867324.90022: done with get_vars() 18662 1726867324.90043: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 18662 1726867324.90225: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18662 1726867324.90246: getting the remaining hosts for this loop 18662 1726867324.90248: done getting the remaining hosts for this loop 18662 1726867324.90250: getting the next task for host managed_node2 18662 1726867324.90252: done getting next task for host managed_node2 18662 1726867324.90254: ^ task is: TASK: Gathering Facts 18662 1726867324.90255: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867324.90257: getting variables 18662 1726867324.90258: in VariableManager get_vars() 18662 1726867324.90269: Calling all_inventory to load vars for managed_node2 18662 1726867324.90271: Calling groups_inventory to load vars for managed_node2 18662 1726867324.90274: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867324.90284: Calling all_plugins_play to load vars for managed_node2 18662 1726867324.90287: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867324.90289: Calling groups_plugins_play to load vars for managed_node2 18662 1726867324.91509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867324.92990: done with get_vars() 18662 1726867324.93008: done getting variables 18662 1726867324.93047: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Friday 20 September 2024 17:22:04 -0400 (0:00:00.201) 0:00:19.566 ****** 18662 1726867324.93068: entering _queue_task() for managed_node2/gather_facts 18662 1726867324.93388: worker is 1 (out of 1 available) 18662 1726867324.93400: exiting _queue_task() for managed_node2/gather_facts 18662 1726867324.93410: done queuing things up, now waiting for results queue to drain 18662 1726867324.93411: waiting for pending results... 
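
The cleanup play that starts here ("Play for cleaning up the test device and the connection profile") opens with implicit fact gathering: the "Gathering Facts" task attributed to tests_ethernet.yml:50 reuses the cached setup module (AnsiballZ_setup.py below) and runs it on the managed host. Spelled out explicitly it would amount to something like the following, shown only for illustration since the play relies on the implicit gather:

    # Illustrative only: fact gathering here is implicit; an explicit setup task
    # would produce the same module transfer and execution seen below.
    - name: Gathering Facts
      ansible.builtin.setup:
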
18662 1726867324.93693: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18662 1726867324.93788: in run() - task 0affcac9-a3a5-efab-a8ce-000000000316 18662 1726867324.93867: variable 'ansible_search_path' from source: unknown 18662 1726867324.93872: calling self._execute() 18662 1726867324.93962: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867324.93984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867324.93999: variable 'omit' from source: magic vars 18662 1726867324.94430: variable 'ansible_distribution_major_version' from source: facts 18662 1726867324.94449: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867324.94466: variable 'omit' from source: magic vars 18662 1726867324.94502: variable 'omit' from source: magic vars 18662 1726867324.94573: variable 'omit' from source: magic vars 18662 1726867324.94599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867324.94647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867324.94683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867324.94782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867324.94787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867324.94791: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867324.94794: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867324.94797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867324.94917: Set connection var ansible_timeout to 10 18662 1726867324.94921: Set connection var ansible_connection to ssh 18662 1726867324.94923: Set connection var ansible_shell_executable to /bin/sh 18662 1726867324.94926: Set connection var ansible_shell_type to sh 18662 1726867324.94930: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867324.94932: Set connection var ansible_pipelining to False 18662 1726867324.95026: variable 'ansible_shell_executable' from source: unknown 18662 1726867324.95030: variable 'ansible_connection' from source: unknown 18662 1726867324.95032: variable 'ansible_module_compression' from source: unknown 18662 1726867324.95035: variable 'ansible_shell_type' from source: unknown 18662 1726867324.95037: variable 'ansible_shell_executable' from source: unknown 18662 1726867324.95039: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867324.95041: variable 'ansible_pipelining' from source: unknown 18662 1726867324.95042: variable 'ansible_timeout' from source: unknown 18662 1726867324.95044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867324.95200: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867324.95220: variable 'omit' from source: magic vars 18662 1726867324.95231: starting attempt loop 18662 1726867324.95247: running the 
handler 18662 1726867324.95271: variable 'ansible_facts' from source: unknown 18662 1726867324.95296: _low_level_execute_command(): starting 18662 1726867324.95353: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867324.96044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867324.96057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867324.96070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867324.96089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867324.96125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867324.96141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867324.96234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867324.96258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867324.96347: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867324.98025: stdout chunk (state=3): >>>/root <<< 18662 1726867324.98183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867324.98186: stdout chunk (state=3): >>><<< 18662 1726867324.98189: stderr chunk (state=3): >>><<< 18662 1726867324.98310: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867324.98315: _low_level_execute_command(): starting 18662 1726867324.98318: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` 
echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979 `" && echo ansible-tmp-1726867324.9822025-19576-64487426744979="` echo /root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979 `" ) && sleep 0' 18662 1726867324.98903: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867324.98921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867324.98936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867324.98958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867324.98999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867324.99104: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867324.99125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867324.99143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867324.99225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867325.01131: stdout chunk (state=3): >>>ansible-tmp-1726867324.9822025-19576-64487426744979=/root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979 <<< 18662 1726867325.01356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867325.01366: stdout chunk (state=3): >>><<< 18662 1726867325.01392: stderr chunk (state=3): >>><<< 18662 1726867325.01583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867324.9822025-19576-64487426744979=/root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867325.01587: variable 'ansible_module_compression' from source: unknown 18662 1726867325.01590: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18662 1726867325.01592: variable 'ansible_facts' from source: unknown 18662 1726867325.01727: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979/AnsiballZ_setup.py 18662 1726867325.01946: Sending initial data 18662 1726867325.01949: Sent initial data (153 bytes) 18662 1726867325.02484: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867325.02501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867325.02587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867325.02616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867325.02633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867325.02655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867325.02725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867325.04307: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867325.04406: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867325.04596: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpc9hme7a2 /root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979/AnsiballZ_setup.py <<< 18662 1726867325.04619: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979/AnsiballZ_setup.py" <<< 18662 1726867325.04657: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpc9hme7a2" to remote "/root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979/AnsiballZ_setup.py" <<< 18662 1726867325.06186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867325.06201: stdout chunk (state=3): >>><<< 18662 1726867325.06220: stderr chunk (state=3): >>><<< 18662 1726867325.06330: done transferring module to remote 18662 1726867325.06333: _low_level_execute_command(): starting 18662 1726867325.06336: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979/ /root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979/AnsiballZ_setup.py && sleep 0' 18662 1726867325.06851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867325.06868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867325.06888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867325.06905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867325.07003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867325.07033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867325.07097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867325.09019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867325.09028: stdout chunk (state=3): >>><<< 18662 1726867325.09158: stderr chunk (state=3): >>><<< 18662 1726867325.09181: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867325.09190: _low_level_execute_command(): starting 18662 1726867325.09193: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979/AnsiballZ_setup.py && sleep 0' 18662 1726867325.10385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 18662 1726867325.10389: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18662 1726867325.10393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867325.10395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867325.10403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867325.10418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867325.10501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867325.10530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867325.10548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867325.10693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867325.77844: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.39599609375, "5m": 0.38134765625, "15m": 0.2021484375}, "ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.<<< 18662 1726867325.77918: stdout chunk (state=3): >>>32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", 
"weeknumber": "38", "day": "20", "hour": "17", "minute": "22", "second": "05", "epoch": "1726867325", "epoch_int": "1726867325", "date": "2024-09-20", "time": "17:22:05", "iso8601_micro": "2024-09-20T21:22:05.390365Z", "iso8601": "2024-09-20T21:22:05Z", "iso8601_basic": "20240920T172205390365", "iso8601_basic_short": "20240920T172205", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["peerlsr27", "lo", "eth0", "lsr27"], "ansible_lsr27": {"device": "lsr27", "macaddress": "32:21:19:c5:50:d2", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::3021:19ff:fec5:50d2", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off 
[fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "e2:02:64:1b:da:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e002:64ff:fe1b:da9a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", 
"ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.1", "10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a", "fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 563, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794783232, "block_size": 4096, "block_total": 65519099, "block_available": 63914742, "block_used": 1604357, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18662 1726867325.80183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867325.80187: stdout chunk (state=3): >>><<< 18662 1726867325.80190: stderr chunk (state=3): >>><<< 18662 1726867325.80193: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.39599609375, "5m": 0.38134765625, "15m": 0.2021484375}, 
"ansible_iscsi_iqn": "", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "22", "second": "05", "epoch": "1726867325", "epoch_int": "1726867325", "date": "2024-09-20", "time": "17:22:05", "iso8601_micro": "2024-09-20T21:22:05.390365Z", "iso8601": "2024-09-20T21:22:05Z", "iso8601_basic": "20240920T172205390365", "iso8601_basic_short": "20240920T172205", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["peerlsr27", "lo", "eth0", "lsr27"], "ansible_lsr27": {"device": "lsr27", "macaddress": "32:21:19:c5:50:d2", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::3021:19ff:fec5:50d2", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": 
"on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "e2:02:64:1b:da:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e002:64ff:fe1b:da9a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", 
"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": 
"127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.1", "10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a", "fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, 
"ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 563, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794783232, "block_size": 4096, "block_total": 65519099, "block_available": 63914742, "block_used": 1604357, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867325.81184: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867325.81189: _low_level_execute_command(): starting 18662 1726867325.81192: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867324.9822025-19576-64487426744979/ > /dev/null 2>&1 && sleep 0' 18662 1726867325.82359: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867325.82402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867325.82566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867325.82745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867325.82814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867325.84824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867325.84833: stdout chunk (state=3): >>><<< 18662 1726867325.84842: stderr chunk (state=3): >>><<< 18662 1726867325.84862: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867325.84907: handler run complete 18662 1726867325.85165: variable 'ansible_facts' from source: unknown 18662 1726867325.85485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867325.86221: variable 'ansible_facts' from source: unknown 18662 1726867325.86695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867325.86774: attempt loop complete, returning result 18662 1726867325.86921: _execute() done 18662 1726867325.86930: dumping result to json 18662 1726867325.86972: done dumping result, returning 18662 1726867325.86987: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-efab-a8ce-000000000316] 18662 1726867325.87031: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000316 18662 1726867325.88416: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000316 18662 1726867325.88420: WORKER PROCESS EXITING ok: [managed_node2] 18662 1726867325.88814: no more pending results, returning what we have 18662 1726867325.88817: results queue empty 18662 1726867325.88818: checking for any_errors_fatal 18662 1726867325.88819: done checking for any_errors_fatal 18662 1726867325.88820: checking for max_fail_percentage 18662 1726867325.88822: done checking for max_fail_percentage 18662 1726867325.88823: checking to see if all hosts have failed and the running result is not ok 18662 1726867325.88823: done checking to see if all hosts have failed 18662 1726867325.88824: getting the remaining hosts for this loop 18662 1726867325.88825: done getting the remaining hosts for this loop 18662 1726867325.88828: getting the next task for host managed_node2 18662 1726867325.88833: done getting next task for host managed_node2 18662 1726867325.88835: ^ task is: TASK: meta (flush_handlers) 18662 1726867325.88837: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867325.88841: getting variables 18662 1726867325.88842: in VariableManager get_vars() 18662 1726867325.88865: Calling all_inventory to load vars for managed_node2 18662 1726867325.88867: Calling groups_inventory to load vars for managed_node2 18662 1726867325.88870: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867325.89186: Calling all_plugins_play to load vars for managed_node2 18662 1726867325.89189: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867325.89194: Calling groups_plugins_play to load vars for managed_node2 18662 1726867325.92100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867325.96475: done with get_vars() 18662 1726867325.96863: done getting variables 18662 1726867325.96930: in VariableManager get_vars() 18662 1726867325.96942: Calling all_inventory to load vars for managed_node2 18662 1726867325.96944: Calling groups_inventory to load vars for managed_node2 18662 1726867325.96947: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867325.96952: Calling all_plugins_play to load vars for managed_node2 18662 1726867325.96955: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867325.96958: Calling groups_plugins_play to load vars for managed_node2 18662 1726867325.99806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867326.02847: done with get_vars() 18662 1726867326.03080: done queuing things up, now waiting for results queue to drain 18662 1726867326.03083: results queue empty 18662 1726867326.03084: checking for any_errors_fatal 18662 1726867326.03088: done checking for any_errors_fatal 18662 1726867326.03089: checking for max_fail_percentage 18662 1726867326.03091: done checking for max_fail_percentage 18662 1726867326.03091: checking to see if all hosts have failed and the running result is not ok 18662 1726867326.03092: done checking to see if all hosts have failed 18662 1726867326.03097: getting the remaining hosts for this loop 18662 1726867326.03098: done getting the remaining hosts for this loop 18662 1726867326.03100: getting the next task for host managed_node2 18662 1726867326.03104: done getting next task for host managed_node2 18662 1726867326.03106: ^ task is: TASK: Show network_provider 18662 1726867326.03108: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867326.03110: getting variables 18662 1726867326.03110: in VariableManager get_vars() 18662 1726867326.03119: Calling all_inventory to load vars for managed_node2 18662 1726867326.03121: Calling groups_inventory to load vars for managed_node2 18662 1726867326.03123: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867326.03128: Calling all_plugins_play to load vars for managed_node2 18662 1726867326.03130: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867326.03132: Calling groups_plugins_play to load vars for managed_node2 18662 1726867326.05568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867326.08456: done with get_vars() 18662 1726867326.08478: done getting variables 18662 1726867326.08722: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Friday 20 September 2024 17:22:06 -0400 (0:00:01.156) 0:00:20.722 ****** 18662 1726867326.08745: entering _queue_task() for managed_node2/debug 18662 1726867326.09710: worker is 1 (out of 1 available) 18662 1726867326.09718: exiting _queue_task() for managed_node2/debug 18662 1726867326.09728: done queuing things up, now waiting for results queue to drain 18662 1726867326.09729: waiting for pending results... 18662 1726867326.09968: running TaskExecutor() for managed_node2/TASK: Show network_provider 18662 1726867326.10065: in run() - task 0affcac9-a3a5-efab-a8ce-000000000033 18662 1726867326.10192: variable 'ansible_search_path' from source: unknown 18662 1726867326.10283: calling self._execute() 18662 1726867326.10585: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867326.10590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867326.10592: variable 'omit' from source: magic vars 18662 1726867326.11332: variable 'ansible_distribution_major_version' from source: facts 18662 1726867326.11392: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867326.11404: variable 'omit' from source: magic vars 18662 1726867326.11436: variable 'omit' from source: magic vars 18662 1726867326.11491: variable 'omit' from source: magic vars 18662 1726867326.11604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867326.11711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867326.11740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867326.11803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867326.11820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867326.11922: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867326.11933: variable 'ansible_host' from source: host vars for 
'managed_node2' 18662 1726867326.11942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867326.12154: Set connection var ansible_timeout to 10 18662 1726867326.12162: Set connection var ansible_connection to ssh 18662 1726867326.12172: Set connection var ansible_shell_executable to /bin/sh 18662 1726867326.12180: Set connection var ansible_shell_type to sh 18662 1726867326.12195: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867326.12224: Set connection var ansible_pipelining to False 18662 1726867326.12253: variable 'ansible_shell_executable' from source: unknown 18662 1726867326.12546: variable 'ansible_connection' from source: unknown 18662 1726867326.12549: variable 'ansible_module_compression' from source: unknown 18662 1726867326.12552: variable 'ansible_shell_type' from source: unknown 18662 1726867326.12554: variable 'ansible_shell_executable' from source: unknown 18662 1726867326.12556: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867326.12558: variable 'ansible_pipelining' from source: unknown 18662 1726867326.12560: variable 'ansible_timeout' from source: unknown 18662 1726867326.12562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867326.12620: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867326.12665: variable 'omit' from source: magic vars 18662 1726867326.12675: starting attempt loop 18662 1726867326.12872: running the handler 18662 1726867326.12875: variable 'network_provider' from source: set_fact 18662 1726867326.13004: variable 'network_provider' from source: set_fact 18662 1726867326.13020: handler run complete 18662 1726867326.13042: attempt loop complete, returning result 18662 1726867326.13050: _execute() done 18662 1726867326.13057: dumping result to json 18662 1726867326.13064: done dumping result, returning 18662 1726867326.13076: done running TaskExecutor() for managed_node2/TASK: Show network_provider [0affcac9-a3a5-efab-a8ce-000000000033] 18662 1726867326.13097: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000033 18662 1726867326.13445: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000033 18662 1726867326.13448: WORKER PROCESS EXITING ok: [managed_node2] => { "network_provider": "nm" } 18662 1726867326.13535: no more pending results, returning what we have 18662 1726867326.13549: results queue empty 18662 1726867326.13550: checking for any_errors_fatal 18662 1726867326.13553: done checking for any_errors_fatal 18662 1726867326.13553: checking for max_fail_percentage 18662 1726867326.13555: done checking for max_fail_percentage 18662 1726867326.13556: checking to see if all hosts have failed and the running result is not ok 18662 1726867326.13557: done checking to see if all hosts have failed 18662 1726867326.13558: getting the remaining hosts for this loop 18662 1726867326.13559: done getting the remaining hosts for this loop 18662 1726867326.13565: getting the next task for host managed_node2 18662 1726867326.13572: done getting next task for host managed_node2 18662 1726867326.13574: ^ task is: TASK: meta (flush_handlers) 18662 1726867326.13576: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867326.13583: getting variables 18662 1726867326.13585: in VariableManager get_vars() 18662 1726867326.13613: Calling all_inventory to load vars for managed_node2 18662 1726867326.13616: Calling groups_inventory to load vars for managed_node2 18662 1726867326.13619: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867326.13630: Calling all_plugins_play to load vars for managed_node2 18662 1726867326.13633: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867326.13635: Calling groups_plugins_play to load vars for managed_node2 18662 1726867326.17696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867326.22504: done with get_vars() 18662 1726867326.22532: done getting variables 18662 1726867326.22814: in VariableManager get_vars() 18662 1726867326.22824: Calling all_inventory to load vars for managed_node2 18662 1726867326.22827: Calling groups_inventory to load vars for managed_node2 18662 1726867326.22829: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867326.22834: Calling all_plugins_play to load vars for managed_node2 18662 1726867326.22836: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867326.22839: Calling groups_plugins_play to load vars for managed_node2 18662 1726867326.25301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867326.28874: done with get_vars() 18662 1726867326.28975: done queuing things up, now waiting for results queue to drain 18662 1726867326.28979: results queue empty 18662 1726867326.28980: checking for any_errors_fatal 18662 1726867326.28983: done checking for any_errors_fatal 18662 1726867326.28984: checking for max_fail_percentage 18662 1726867326.28985: done checking for max_fail_percentage 18662 1726867326.28986: checking to see if all hosts have failed and the running result is not ok 18662 1726867326.28987: done checking to see if all hosts have failed 18662 1726867326.28987: getting the remaining hosts for this loop 18662 1726867326.28988: done getting the remaining hosts for this loop 18662 1726867326.28991: getting the next task for host managed_node2 18662 1726867326.29000: done getting next task for host managed_node2 18662 1726867326.29002: ^ task is: TASK: meta (flush_handlers) 18662 1726867326.29003: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867326.29006: getting variables 18662 1726867326.29007: in VariableManager get_vars() 18662 1726867326.29017: Calling all_inventory to load vars for managed_node2 18662 1726867326.29020: Calling groups_inventory to load vars for managed_node2 18662 1726867326.29022: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867326.29027: Calling all_plugins_play to load vars for managed_node2 18662 1726867326.29030: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867326.29033: Calling groups_plugins_play to load vars for managed_node2 18662 1726867326.30897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867326.33361: done with get_vars() 18662 1726867326.33383: done getting variables 18662 1726867326.33532: in VariableManager get_vars() 18662 1726867326.33541: Calling all_inventory to load vars for managed_node2 18662 1726867326.33543: Calling groups_inventory to load vars for managed_node2 18662 1726867326.33545: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867326.33549: Calling all_plugins_play to load vars for managed_node2 18662 1726867326.33551: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867326.33554: Calling groups_plugins_play to load vars for managed_node2 18662 1726867326.34732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867326.37363: done with get_vars() 18662 1726867326.37391: done queuing things up, now waiting for results queue to drain 18662 1726867326.37393: results queue empty 18662 1726867326.37394: checking for any_errors_fatal 18662 1726867326.37395: done checking for any_errors_fatal 18662 1726867326.37396: checking for max_fail_percentage 18662 1726867326.37397: done checking for max_fail_percentage 18662 1726867326.37397: checking to see if all hosts have failed and the running result is not ok 18662 1726867326.37398: done checking to see if all hosts have failed 18662 1726867326.37399: getting the remaining hosts for this loop 18662 1726867326.37400: done getting the remaining hosts for this loop 18662 1726867326.37402: getting the next task for host managed_node2 18662 1726867326.37405: done getting next task for host managed_node2 18662 1726867326.37406: ^ task is: None 18662 1726867326.37408: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867326.37409: done queuing things up, now waiting for results queue to drain 18662 1726867326.37409: results queue empty 18662 1726867326.37410: checking for any_errors_fatal 18662 1726867326.37411: done checking for any_errors_fatal 18662 1726867326.37411: checking for max_fail_percentage 18662 1726867326.37412: done checking for max_fail_percentage 18662 1726867326.37413: checking to see if all hosts have failed and the running result is not ok 18662 1726867326.37414: done checking to see if all hosts have failed 18662 1726867326.37415: getting the next task for host managed_node2 18662 1726867326.37417: done getting next task for host managed_node2 18662 1726867326.37418: ^ task is: None 18662 1726867326.37419: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867326.37585: in VariableManager get_vars() 18662 1726867326.37605: done with get_vars() 18662 1726867326.37610: in VariableManager get_vars() 18662 1726867326.37621: done with get_vars() 18662 1726867326.37625: variable 'omit' from source: magic vars 18662 1726867326.37849: variable 'profile' from source: play vars 18662 1726867326.38140: in VariableManager get_vars() 18662 1726867326.38156: done with get_vars() 18662 1726867326.38181: variable 'omit' from source: magic vars 18662 1726867326.38692: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 18662 1726867326.40581: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18662 1726867326.40926: getting the remaining hosts for this loop 18662 1726867326.40927: done getting the remaining hosts for this loop 18662 1726867326.40930: getting the next task for host managed_node2 18662 1726867326.40933: done getting next task for host managed_node2 18662 1726867326.40935: ^ task is: TASK: Gathering Facts 18662 1726867326.40936: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867326.40938: getting variables 18662 1726867326.40939: in VariableManager get_vars() 18662 1726867326.40950: Calling all_inventory to load vars for managed_node2 18662 1726867326.40952: Calling groups_inventory to load vars for managed_node2 18662 1726867326.40954: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867326.40959: Calling all_plugins_play to load vars for managed_node2 18662 1726867326.40961: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867326.40963: Calling groups_plugins_play to load vars for managed_node2 18662 1726867326.44764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867326.49089: done with get_vars() 18662 1726867326.49117: done getting variables 18662 1726867326.49284: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 17:22:06 -0400 (0:00:00.405) 0:00:21.128 ****** 18662 1726867326.49311: entering _queue_task() for managed_node2/gather_facts 18662 1726867326.50182: worker is 1 (out of 1 available) 18662 1726867326.50194: exiting _queue_task() for managed_node2/gather_facts 18662 1726867326.50205: done queuing things up, now waiting for results queue to drain 18662 1726867326.50207: waiting for pending results... 
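For orientation before the next play's fact gathering begins: the ok result logged above for TASK: Show network_provider ({"network_provider": "nm"}) comes from the 'debug' action plugin reading a variable that was set earlier via set_fact. A minimal sketch of a task that would produce that output is shown below; the actual body at tests_ethernet.yml:53 is not reproduced in this log, so the sketch is an illustration, not the real task.

    # Hypothetical sketch only: a debug task matching the "Show network_provider"
    # result seen in this log. The real task in tests_ethernet.yml may differ.
    - name: Show network_provider
      debug:
        var: network_provider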
18662 1726867326.50597: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18662 1726867326.50749: in run() - task 0affcac9-a3a5-efab-a8ce-00000000032b 18662 1726867326.50918: variable 'ansible_search_path' from source: unknown 18662 1726867326.50922: calling self._execute() 18662 1726867326.51244: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867326.51247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867326.51251: variable 'omit' from source: magic vars 18662 1726867326.52010: variable 'ansible_distribution_major_version' from source: facts 18662 1726867326.52014: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867326.52017: variable 'omit' from source: magic vars 18662 1726867326.52020: variable 'omit' from source: magic vars 18662 1726867326.52340: variable 'omit' from source: magic vars 18662 1726867326.52344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867326.52348: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867326.52350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867326.52352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867326.52354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867326.52559: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867326.52562: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867326.52565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867326.52698: Set connection var ansible_timeout to 10 18662 1726867326.52707: Set connection var ansible_connection to ssh 18662 1726867326.52719: Set connection var ansible_shell_executable to /bin/sh 18662 1726867326.52727: Set connection var ansible_shell_type to sh 18662 1726867326.52742: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867326.52789: Set connection var ansible_pipelining to False 18662 1726867326.52821: variable 'ansible_shell_executable' from source: unknown 18662 1726867326.52905: variable 'ansible_connection' from source: unknown 18662 1726867326.52908: variable 'ansible_module_compression' from source: unknown 18662 1726867326.52910: variable 'ansible_shell_type' from source: unknown 18662 1726867326.52913: variable 'ansible_shell_executable' from source: unknown 18662 1726867326.52915: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867326.52993: variable 'ansible_pipelining' from source: unknown 18662 1726867326.52996: variable 'ansible_timeout' from source: unknown 18662 1726867326.52998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867326.53288: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867326.53305: variable 'omit' from source: magic vars 18662 1726867326.53349: starting attempt loop 18662 1726867326.53356: running the 
handler 18662 1726867326.53379: variable 'ansible_facts' from source: unknown 18662 1726867326.53669: _low_level_execute_command(): starting 18662 1726867326.53672: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867326.54886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867326.55080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867326.55229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867326.55274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867326.56970: stdout chunk (state=3): >>>/root <<< 18662 1726867326.57237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867326.57240: stdout chunk (state=3): >>><<< 18662 1726867326.57243: stderr chunk (state=3): >>><<< 18662 1726867326.57270: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867326.57466: _low_level_execute_command(): starting 18662 1726867326.57470: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948 `" && echo ansible-tmp-1726867326.573734-19653-260793079077948="` echo /root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948 `" ) && sleep 0' 
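The command just issued creates a remote temp directory under /root/.ansible/tmp; the log then uploads AnsiballZ_setup.py over SFTP, chmods it, and runs it with the remote python3.12. This round trip happens because the connection variable ansible_pipelining was set to False earlier in the log. A hedged inventory/group_vars sketch that would enable pipelining instead (assuming the remote side permits it, e.g. no requiretty restriction when become is used) is shown below; it is not part of this test run.

    # Hypothetical group_vars/host_vars entry, not taken from this run:
    # with pipelining enabled, the module payload is piped to the remote Python
    # over the existing SSH connection instead of written to a remote temp dir.
    ansible_pipelining: true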
18662 1726867326.58668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867326.58775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867326.58808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867326.58846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867326.60849: stdout chunk (state=3): >>>ansible-tmp-1726867326.573734-19653-260793079077948=/root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948 <<< 18662 1726867326.60994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867326.61004: stdout chunk (state=3): >>><<< 18662 1726867326.61021: stderr chunk (state=3): >>><<< 18662 1726867326.61045: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867326.573734-19653-260793079077948=/root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867326.61088: variable 'ansible_module_compression' from source: unknown 18662 1726867326.61238: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18662 1726867326.61341: variable 'ansible_facts' from source: unknown 18662 1726867326.61863: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948/AnsiballZ_setup.py 18662 1726867326.62216: Sending initial data 18662 1726867326.62219: Sent initial data (153 bytes) 18662 1726867326.63296: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867326.63316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867326.63329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867326.63339: stderr chunk (state=3): >>>debug2: match found <<< 18662 1726867326.63353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867326.63568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867326.63604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867326.65407: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867326.65419: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867326.65489: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp6xhzaino /root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948/AnsiballZ_setup.py <<< 18662 1726867326.65493: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948/AnsiballZ_setup.py" <<< 18662 1726867326.65757: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp6xhzaino" to remote "/root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948/AnsiballZ_setup.py" <<< 18662 1726867326.68573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867326.68609: stdout chunk (state=3): >>><<< 18662 1726867326.68792: stderr chunk (state=3): >>><<< 18662 1726867326.68796: done transferring module to remote 18662 1726867326.68798: _low_level_execute_command(): starting 18662 1726867326.68801: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948/ /root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948/AnsiballZ_setup.py && sleep 0' 18662 1726867326.70167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867326.70281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867326.70304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867326.70379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867326.72297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867326.72330: stderr chunk (state=3): >>><<< 18662 1726867326.72339: stdout chunk (state=3): >>><<< 18662 1726867326.72363: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867326.72379: _low_level_execute_command(): starting 18662 1726867326.72484: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948/AnsiballZ_setup.py && sleep 0' 18662 1726867326.73608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867326.73646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867326.73663: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18662 1726867326.73865: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867326.73887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867326.73918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867326.73966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867327.39329: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.44482421875, "5m": 0.39208984375, "15m": 0.20654296875}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "22", "second": "07", "epoch": "1726867327", "epoch_int": "1726867327", "date": "2024-09-20", "time": "17:22:07", "iso8601_micro": "2024-09-20T21:22:07.019517Z", "iso8601": "2024-09-20T21:22:07Z", "iso8601_basic": "20240920T172207019517", 
"iso8601_basic_short": "20240920T172207", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo", "peerlsr27", "lsr27"], "ansible_lsr27": {"device": "lsr27", "macaddress": "32:21:19:c5:50:d2", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::3021:19ff:fec5:50d2", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "e2:02:64:1b:da:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e002:64ff:fe1b:da9a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", 
"tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_<<< 18662 1726867327.39358: stdout chunk (state=3): >>>hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", 
"rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 
9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.1", "10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a", "fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3294, "used": 237}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 565, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794783232, "block_size": 4096, "block_total": 65519099, "block_available": 63914742, "block_used": 1604357, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": 
"targeted"}, "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18662 1726867327.41527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867327.41539: stdout chunk (state=3): >>><<< 18662 1726867327.41553: stderr chunk (state=3): >>><<< 18662 1726867327.41897: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_local": {}, "ansible_loadavg": {"1m": 0.44482421875, "5m": 0.39208984375, "15m": 0.20654296875}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "22", "second": "07", "epoch": "1726867327", "epoch_int": "1726867327", "date": "2024-09-20", "time": "17:22:07", "iso8601_micro": "2024-09-20T21:22:07.019517Z", "iso8601": "2024-09-20T21:22:07Z", "iso8601_basic": "20240920T172207019517", "iso8601_basic_short": "20240920T172207", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["eth0", "lo", "peerlsr27", "lsr27"], "ansible_lsr27": {"device": "lsr27", "macaddress": "32:21:19:c5:50:d2", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::3021:19ff:fec5:50d2", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", 
"tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "e2:02:64:1b:da:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e002:64ff:fe1b:da9a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, 
"active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": 
"off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["192.0.2.1", "10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a", "fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3294, "used": 237}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", 
"ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 565, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794783232, "block_size": 4096, "block_total": 65519099, "block_available": 63914742, "block_used": 1604357, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867327.42838: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867327.43100: _low_level_execute_command(): starting 18662 1726867327.43104: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867326.573734-19653-260793079077948/ > /dev/null 2>&1 && sleep 0' 18662 1726867327.44343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867327.44347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867327.44349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867327.44351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 18662 1726867327.44353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867327.44618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867327.44634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867327.44639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867327.44741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867327.46615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867327.46619: stdout chunk (state=3): >>><<< 18662 1726867327.46621: stderr chunk (state=3): >>><<< 18662 1726867327.46985: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867327.46989: handler run complete 18662 1726867327.46992: variable 'ansible_facts' from source: unknown 18662 1726867327.47365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867327.47745: variable 'ansible_facts' from source: unknown 18662 1726867327.47866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867327.48020: attempt loop complete, returning result 18662 1726867327.48031: _execute() done 18662 1726867327.48037: dumping result to json 18662 1726867327.48088: done dumping result, returning 18662 1726867327.48101: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-efab-a8ce-00000000032b] 18662 1726867327.48111: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000032b ok: [managed_node2] 18662 1726867327.48984: no more pending results, returning what we have 18662 1726867327.49100: results queue empty 18662 1726867327.49101: checking for any_errors_fatal 18662 1726867327.49103: done checking for any_errors_fatal 18662 1726867327.49104: checking for max_fail_percentage 18662 1726867327.49105: done checking for max_fail_percentage 18662 1726867327.49106: checking to see if all hosts have failed and the running result is not ok 18662 1726867327.49107: done checking to see if all hosts have failed 18662 1726867327.49111: getting the remaining hosts for this loop 18662 1726867327.49112: done getting the remaining hosts for this loop 18662 1726867327.49115: getting the next task for host managed_node2 18662 1726867327.49121: done getting next task for host managed_node2 18662 1726867327.49122: ^ task is: TASK: meta (flush_handlers) 18662 1726867327.49124: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867327.49128: getting variables 18662 1726867327.49129: in VariableManager get_vars() 18662 1726867327.49158: Calling all_inventory to load vars for managed_node2 18662 1726867327.49161: Calling groups_inventory to load vars for managed_node2 18662 1726867327.49163: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867327.49291: Calling all_plugins_play to load vars for managed_node2 18662 1726867327.49295: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867327.49299: Calling groups_plugins_play to load vars for managed_node2 18662 1726867327.49827: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000032b 18662 1726867327.49830: WORKER PROCESS EXITING 18662 1726867327.51231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867327.54872: done with get_vars() 18662 1726867327.55000: done getting variables 18662 1726867327.55183: in VariableManager get_vars() 18662 1726867327.55195: Calling all_inventory to load vars for managed_node2 18662 1726867327.55197: Calling groups_inventory to load vars for managed_node2 18662 1726867327.55199: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867327.55204: Calling all_plugins_play to load vars for managed_node2 18662 1726867327.55206: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867327.55210: Calling groups_plugins_play to load vars for managed_node2 18662 1726867327.56921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867327.58807: done with get_vars() 18662 1726867327.58842: done queuing things up, now waiting for results queue to drain 18662 1726867327.58844: results queue empty 18662 1726867327.58845: checking for any_errors_fatal 18662 1726867327.58855: done checking for any_errors_fatal 18662 1726867327.58860: checking for max_fail_percentage 18662 1726867327.58862: done checking for max_fail_percentage 18662 1726867327.58863: checking to see if all hosts have failed and the running result is not ok 18662 1726867327.58863: done checking to see if all hosts have failed 18662 1726867327.58864: getting the remaining hosts for this loop 18662 1726867327.58865: done getting the remaining hosts for this loop 18662 1726867327.58868: getting the next task for host managed_node2 18662 1726867327.58872: done getting next task for host managed_node2 18662 1726867327.58876: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18662 1726867327.58879: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867327.58888: getting variables 18662 1726867327.58890: in VariableManager get_vars() 18662 1726867327.58912: Calling all_inventory to load vars for managed_node2 18662 1726867327.58915: Calling groups_inventory to load vars for managed_node2 18662 1726867327.58917: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867327.58922: Calling all_plugins_play to load vars for managed_node2 18662 1726867327.58925: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867327.58928: Calling groups_plugins_play to load vars for managed_node2 18662 1726867327.60299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867327.61964: done with get_vars() 18662 1726867327.61985: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:22:07 -0400 (0:00:01.127) 0:00:22.256 ****** 18662 1726867327.62072: entering _queue_task() for managed_node2/include_tasks 18662 1726867327.62512: worker is 1 (out of 1 available) 18662 1726867327.62524: exiting _queue_task() for managed_node2/include_tasks 18662 1726867327.62537: done queuing things up, now waiting for results queue to drain 18662 1726867327.62538: waiting for pending results... 18662 1726867327.62835: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18662 1726867327.62968: in run() - task 0affcac9-a3a5-efab-a8ce-00000000003c 18662 1726867327.63018: variable 'ansible_search_path' from source: unknown 18662 1726867327.63021: variable 'ansible_search_path' from source: unknown 18662 1726867327.63055: calling self._execute() 18662 1726867327.63172: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867327.63237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867327.63244: variable 'omit' from source: magic vars 18662 1726867327.63697: variable 'ansible_distribution_major_version' from source: facts 18662 1726867327.63719: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867327.63731: _execute() done 18662 1726867327.63772: dumping result to json 18662 1726867327.63776: done dumping result, returning 18662 1726867327.63859: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-efab-a8ce-00000000003c] 18662 1726867327.63861: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000003c 18662 1726867327.64353: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000003c 18662 1726867327.64356: WORKER PROCESS EXITING 18662 1726867327.64422: no more pending results, returning what we have 18662 1726867327.64426: in VariableManager get_vars() 18662 1726867327.64463: Calling all_inventory to load vars for managed_node2 18662 1726867327.64465: Calling groups_inventory to load vars for managed_node2 18662 1726867327.64467: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867327.64478: Calling all_plugins_play to load vars for managed_node2 18662 1726867327.64481: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867327.64484: Calling groups_plugins_play to load vars for managed_node2 18662 1726867327.66802: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867327.68732: done with get_vars() 18662 1726867327.68751: variable 'ansible_search_path' from source: unknown 18662 1726867327.68753: variable 'ansible_search_path' from source: unknown 18662 1726867327.68784: we have included files to process 18662 1726867327.68785: generating all_blocks data 18662 1726867327.68786: done generating all_blocks data 18662 1726867327.68787: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18662 1726867327.68788: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18662 1726867327.68791: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18662 1726867327.69411: done processing included file 18662 1726867327.69413: iterating over new_blocks loaded from include file 18662 1726867327.69415: in VariableManager get_vars() 18662 1726867327.69434: done with get_vars() 18662 1726867327.69436: filtering new block on tags 18662 1726867327.69450: done filtering new block on tags 18662 1726867327.69453: in VariableManager get_vars() 18662 1726867327.69471: done with get_vars() 18662 1726867327.69472: filtering new block on tags 18662 1726867327.69501: done filtering new block on tags 18662 1726867327.69504: in VariableManager get_vars() 18662 1726867327.69526: done with get_vars() 18662 1726867327.69528: filtering new block on tags 18662 1726867327.69543: done filtering new block on tags 18662 1726867327.69545: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 18662 1726867327.69550: extending task lists for all hosts with included blocks 18662 1726867327.69953: done extending task lists 18662 1726867327.69955: done processing included files 18662 1726867327.69955: results queue empty 18662 1726867327.69956: checking for any_errors_fatal 18662 1726867327.69958: done checking for any_errors_fatal 18662 1726867327.69958: checking for max_fail_percentage 18662 1726867327.69959: done checking for max_fail_percentage 18662 1726867327.69960: checking to see if all hosts have failed and the running result is not ok 18662 1726867327.69961: done checking to see if all hosts have failed 18662 1726867327.69962: getting the remaining hosts for this loop 18662 1726867327.69963: done getting the remaining hosts for this loop 18662 1726867327.69965: getting the next task for host managed_node2 18662 1726867327.69969: done getting next task for host managed_node2 18662 1726867327.69971: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18662 1726867327.69974: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867327.69985: getting variables 18662 1726867327.69986: in VariableManager get_vars() 18662 1726867327.69999: Calling all_inventory to load vars for managed_node2 18662 1726867327.70001: Calling groups_inventory to load vars for managed_node2 18662 1726867327.70003: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867327.70011: Calling all_plugins_play to load vars for managed_node2 18662 1726867327.70014: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867327.70017: Calling groups_plugins_play to load vars for managed_node2 18662 1726867327.76576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867327.78325: done with get_vars() 18662 1726867327.78347: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:22:07 -0400 (0:00:00.163) 0:00:22.419 ****** 18662 1726867327.78432: entering _queue_task() for managed_node2/setup 18662 1726867327.78810: worker is 1 (out of 1 available) 18662 1726867327.78822: exiting _queue_task() for managed_node2/setup 18662 1726867327.78833: done queuing things up, now waiting for results queue to drain 18662 1726867327.78834: waiting for pending results... 18662 1726867327.79144: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18662 1726867327.79304: in run() - task 0affcac9-a3a5-efab-a8ce-00000000036c 18662 1726867327.79325: variable 'ansible_search_path' from source: unknown 18662 1726867327.79332: variable 'ansible_search_path' from source: unknown 18662 1726867327.79367: calling self._execute() 18662 1726867327.79475: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867327.79501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867327.79519: variable 'omit' from source: magic vars 18662 1726867327.79941: variable 'ansible_distribution_major_version' from source: facts 18662 1726867327.79958: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867327.80194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867327.82501: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867327.82688: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867327.82744: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867327.82841: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867327.82975: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867327.83020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867327.83056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18662 1726867327.83092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867327.83160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867327.83180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867327.83288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867327.83403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867327.83441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867327.83545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867327.83549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867327.83902: variable '__network_required_facts' from source: role '' defaults 18662 1726867327.83923: variable 'ansible_facts' from source: unknown 18662 1726867327.85794: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18662 1726867327.85807: when evaluation is False, skipping this task 18662 1726867327.85851: _execute() done 18662 1726867327.85855: dumping result to json 18662 1726867327.85858: done dumping result, returning 18662 1726867327.85861: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-efab-a8ce-00000000036c] 18662 1726867327.85863: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000036c 18662 1726867327.86285: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000036c 18662 1726867327.86289: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867327.86334: no more pending results, returning what we have 18662 1726867327.86339: results queue empty 18662 1726867327.86340: checking for any_errors_fatal 18662 1726867327.86341: done checking for any_errors_fatal 18662 1726867327.86342: checking for max_fail_percentage 18662 1726867327.86344: done checking for max_fail_percentage 18662 1726867327.86345: checking to see if all hosts have failed and the running result is not ok 18662 1726867327.86345: done checking to see if all hosts have failed 18662 1726867327.86346: getting the remaining hosts for this loop 18662 1726867327.86348: done getting the remaining hosts for 
this loop 18662 1726867327.86352: getting the next task for host managed_node2 18662 1726867327.86362: done getting next task for host managed_node2 18662 1726867327.86367: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18662 1726867327.86369: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867327.86385: getting variables 18662 1726867327.86391: in VariableManager get_vars() 18662 1726867327.86430: Calling all_inventory to load vars for managed_node2 18662 1726867327.86433: Calling groups_inventory to load vars for managed_node2 18662 1726867327.86435: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867327.86446: Calling all_plugins_play to load vars for managed_node2 18662 1726867327.86449: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867327.86452: Calling groups_plugins_play to load vars for managed_node2 18662 1726867327.89958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867327.92172: done with get_vars() 18662 1726867327.92197: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:22:07 -0400 (0:00:00.139) 0:00:22.559 ****** 18662 1726867327.92410: entering _queue_task() for managed_node2/stat 18662 1726867327.93143: worker is 1 (out of 1 available) 18662 1726867327.93152: exiting _queue_task() for managed_node2/stat 18662 1726867327.93162: done queuing things up, now waiting for results queue to drain 18662 1726867327.93163: waiting for pending results... 
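The conditional evaluated just above, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, came back False, so the "Ensure ansible_facts used by role are present" task at set_facts.yml:3 was skipped and its result was censored because no_log is in effect. A minimal sketch of that gating pattern, assuming a plain setup call; the gather_subset value and the no_log placement are illustrative assumptions, not the role's actual task body:

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min   # illustrative value; the real subset is not visible in this log
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true           # matches the censored result shown above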
18662 1726867327.93614: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 18662 1726867327.94187: in run() - task 0affcac9-a3a5-efab-a8ce-00000000036e 18662 1726867327.94191: variable 'ansible_search_path' from source: unknown 18662 1726867327.94193: variable 'ansible_search_path' from source: unknown 18662 1726867327.94196: calling self._execute() 18662 1726867327.94198: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867327.94629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867327.94632: variable 'omit' from source: magic vars 18662 1726867327.95488: variable 'ansible_distribution_major_version' from source: facts 18662 1726867327.95579: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867327.95976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867327.96584: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867327.96617: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867327.96776: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867327.96935: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867327.97015: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867327.97062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867327.97172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867327.97205: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867327.97476: variable '__network_is_ostree' from source: set_fact 18662 1726867327.97492: Evaluated conditional (not __network_is_ostree is defined): False 18662 1726867327.97499: when evaluation is False, skipping this task 18662 1726867327.97506: _execute() done 18662 1726867327.97513: dumping result to json 18662 1726867327.97520: done dumping result, returning 18662 1726867327.97530: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-efab-a8ce-00000000036e] 18662 1726867327.97539: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000036e skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18662 1726867327.97728: no more pending results, returning what we have 18662 1726867327.97732: results queue empty 18662 1726867327.97733: checking for any_errors_fatal 18662 1726867327.97739: done checking for any_errors_fatal 18662 1726867327.97739: checking for max_fail_percentage 18662 1726867327.97742: done checking for max_fail_percentage 18662 1726867327.97743: checking to see if all hosts have 
failed and the running result is not ok 18662 1726867327.97743: done checking to see if all hosts have failed 18662 1726867327.97744: getting the remaining hosts for this loop 18662 1726867327.97746: done getting the remaining hosts for this loop 18662 1726867327.97750: getting the next task for host managed_node2 18662 1726867327.97758: done getting next task for host managed_node2 18662 1726867327.97762: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18662 1726867327.97765: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867327.97782: getting variables 18662 1726867327.97784: in VariableManager get_vars() 18662 1726867327.98016: Calling all_inventory to load vars for managed_node2 18662 1726867327.98019: Calling groups_inventory to load vars for managed_node2 18662 1726867327.98021: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867327.98033: Calling all_plugins_play to load vars for managed_node2 18662 1726867327.98036: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867327.98039: Calling groups_plugins_play to load vars for managed_node2 18662 1726867327.98973: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000036e 18662 1726867327.98976: WORKER PROCESS EXITING 18662 1726867328.01921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867328.05237: done with get_vars() 18662 1726867328.05261: done getting variables 18662 1726867328.05430: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:22:08 -0400 (0:00:00.130) 0:00:22.690 ****** 18662 1726867328.05465: entering _queue_task() for managed_node2/set_fact 18662 1726867328.06489: worker is 1 (out of 1 available) 18662 1726867328.06499: exiting _queue_task() for managed_node2/set_fact 18662 1726867328.06509: done queuing things up, now waiting for results queue to drain 18662 1726867328.06510: waiting for pending results... 
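The stat task above and the set_fact task queued next (set_facts.yml:12 and :17) form a check-then-cache pair: both carry the guard not __network_is_ostree is defined, so once the flag exists neither runs again, which is why both are skipped in this run. A hedged sketch of that shape; the stat path and register name are assumptions used only for illustration:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted        # hypothetical path, shown only to illustrate the check
      register: __ostree_stat           # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_stat.stat.exists | default(false) }}"
      when: not __network_is_ostree is defined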
18662 1726867328.06780: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18662 1726867328.07055: in run() - task 0affcac9-a3a5-efab-a8ce-00000000036f 18662 1726867328.07069: variable 'ansible_search_path' from source: unknown 18662 1726867328.07072: variable 'ansible_search_path' from source: unknown 18662 1726867328.07111: calling self._execute() 18662 1726867328.07463: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867328.07467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867328.07470: variable 'omit' from source: magic vars 18662 1726867328.08342: variable 'ansible_distribution_major_version' from source: facts 18662 1726867328.08353: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867328.08755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867328.09260: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867328.09423: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867328.09454: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867328.09693: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867328.09788: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867328.09834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867328.09965: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867328.09994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867328.10205: variable '__network_is_ostree' from source: set_fact 18662 1726867328.10215: Evaluated conditional (not __network_is_ostree is defined): False 18662 1726867328.10219: when evaluation is False, skipping this task 18662 1726867328.10221: _execute() done 18662 1726867328.10228: dumping result to json 18662 1726867328.10232: done dumping result, returning 18662 1726867328.10235: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-efab-a8ce-00000000036f] 18662 1726867328.10238: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000036f 18662 1726867328.10441: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000036f 18662 1726867328.10585: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18662 1726867328.10630: no more pending results, returning what we have 18662 1726867328.10633: results queue empty 18662 1726867328.10634: checking for any_errors_fatal 18662 1726867328.10642: done checking for any_errors_fatal 18662 
1726867328.10643: checking for max_fail_percentage 18662 1726867328.10644: done checking for max_fail_percentage 18662 1726867328.10645: checking to see if all hosts have failed and the running result is not ok 18662 1726867328.10646: done checking to see if all hosts have failed 18662 1726867328.10647: getting the remaining hosts for this loop 18662 1726867328.10648: done getting the remaining hosts for this loop 18662 1726867328.10651: getting the next task for host managed_node2 18662 1726867328.10666: done getting next task for host managed_node2 18662 1726867328.10670: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18662 1726867328.10672: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867328.10687: getting variables 18662 1726867328.10689: in VariableManager get_vars() 18662 1726867328.10730: Calling all_inventory to load vars for managed_node2 18662 1726867328.10733: Calling groups_inventory to load vars for managed_node2 18662 1726867328.10735: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867328.10747: Calling all_plugins_play to load vars for managed_node2 18662 1726867328.10750: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867328.10753: Calling groups_plugins_play to load vars for managed_node2 18662 1726867328.15425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867328.18696: done with get_vars() 18662 1726867328.18722: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:22:08 -0400 (0:00:00.133) 0:00:22.823 ****** 18662 1726867328.18818: entering _queue_task() for managed_node2/service_facts 18662 1726867328.19260: worker is 1 (out of 1 available) 18662 1726867328.19395: exiting _queue_task() for managed_node2/service_facts 18662 1726867328.19409: done queuing things up, now waiting for results queue to drain 18662 1726867328.19411: waiting for pending results... 
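The task queued above (set_facts.yml:21) resolves to the service_facts action for managed_node2. service_facts takes no arguments and publishes its results under ansible_facts.services, keyed by unit name. A minimal sketch of the call plus one illustrative consumer; the debug task and the NetworkManager.service key are assumptions, not part of the role:

    - name: Check which services are running
      service_facts:

    - name: Illustrative consumer of the gathered service facts
      debug:
        msg: "NetworkManager.service is {{ ansible_facts.services['NetworkManager.service'].state | default('unknown') }}"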
18662 1726867328.19718: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 18662 1726867328.19934: in run() - task 0affcac9-a3a5-efab-a8ce-000000000371 18662 1726867328.19938: variable 'ansible_search_path' from source: unknown 18662 1726867328.19943: variable 'ansible_search_path' from source: unknown 18662 1726867328.19969: calling self._execute() 18662 1726867328.20076: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867328.20092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867328.20150: variable 'omit' from source: magic vars 18662 1726867328.20496: variable 'ansible_distribution_major_version' from source: facts 18662 1726867328.20515: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867328.20525: variable 'omit' from source: magic vars 18662 1726867328.20587: variable 'omit' from source: magic vars 18662 1726867328.20683: variable 'omit' from source: magic vars 18662 1726867328.20690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867328.20735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867328.20776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867328.20806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867328.20907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867328.20913: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867328.20916: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867328.20918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867328.21050: Set connection var ansible_timeout to 10 18662 1726867328.21060: Set connection var ansible_connection to ssh 18662 1726867328.21075: Set connection var ansible_shell_executable to /bin/sh 18662 1726867328.21088: Set connection var ansible_shell_type to sh 18662 1726867328.21112: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867328.21131: Set connection var ansible_pipelining to False 18662 1726867328.21162: variable 'ansible_shell_executable' from source: unknown 18662 1726867328.21171: variable 'ansible_connection' from source: unknown 18662 1726867328.21182: variable 'ansible_module_compression' from source: unknown 18662 1726867328.21241: variable 'ansible_shell_type' from source: unknown 18662 1726867328.21249: variable 'ansible_shell_executable' from source: unknown 18662 1726867328.21251: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867328.21253: variable 'ansible_pipelining' from source: unknown 18662 1726867328.21256: variable 'ansible_timeout' from source: unknown 18662 1726867328.21259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867328.21706: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867328.21723: variable 'omit' from source: magic vars 18662 
1726867328.21827: starting attempt loop 18662 1726867328.21830: running the handler 18662 1726867328.21832: _low_level_execute_command(): starting 18662 1726867328.21835: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867328.23345: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867328.23585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867328.23604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867328.23782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867328.25399: stdout chunk (state=3): >>>/root <<< 18662 1726867328.25496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867328.25508: stdout chunk (state=3): >>><<< 18662 1726867328.25521: stderr chunk (state=3): >>><<< 18662 1726867328.25712: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867328.25717: _low_level_execute_command(): starting 18662 1726867328.25720: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187 `" && echo ansible-tmp-1726867328.25541-19717-209617153258187="` echo /root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187 `" ) && sleep 0' 
18662 1726867328.27461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867328.27749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867328.27753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867328.27755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867328.27757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867328.29708: stdout chunk (state=3): >>>ansible-tmp-1726867328.25541-19717-209617153258187=/root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187 <<< 18662 1726867328.29760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867328.29821: stderr chunk (state=3): >>><<< 18662 1726867328.30197: stdout chunk (state=3): >>><<< 18662 1726867328.30219: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867328.25541-19717-209617153258187=/root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867328.30267: variable 'ansible_module_compression' from source: unknown 18662 1726867328.30322: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 18662 1726867328.30364: variable 'ansible_facts' from source: unknown 18662 1726867328.30560: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187/AnsiballZ_service_facts.py 18662 1726867328.31921: Sending initial data 18662 1726867328.31924: Sent initial data (160 bytes) 18662 1726867328.32354: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867328.32358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867328.32683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867328.32804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867328.32862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867328.34649: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18662 1726867328.34653: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867328.34767: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867328.34845: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp6b595lu5 /root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187/AnsiballZ_service_facts.py <<< 18662 1726867328.34848: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187/AnsiballZ_service_facts.py" <<< 18662 1726867328.34946: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp6b595lu5" to remote "/root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187/AnsiballZ_service_facts.py" <<< 18662 1726867328.34949: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187/AnsiballZ_service_facts.py" <<< 18662 1726867328.36501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867328.36528: stdout chunk (state=3): >>><<< 18662 1726867328.36539: stderr chunk (state=3): >>><<< 18662 1726867328.36792: done transferring module to remote 18662 1726867328.36796: _low_level_execute_command(): starting 18662 1726867328.36801: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187/ /root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187/AnsiballZ_service_facts.py && sleep 0' 18662 1726867328.37826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867328.37844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867328.37972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867328.38090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867328.38108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867328.38182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867328.40084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867328.40160: stderr chunk (state=3): >>><<< 18662 1726867328.40164: stdout chunk (state=3): >>><<< 18662 1726867328.40395: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867328.40405: _low_level_execute_command(): starting 18662 1726867328.40408: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187/AnsiballZ_service_facts.py && sleep 0' 18662 1726867328.41617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867328.41883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867328.41908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867328.41928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867328.42017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867330.00690: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 18662 1726867330.01086: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": 
{"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18662 1726867330.02373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867330.02381: stdout chunk (state=3): >>><<< 18662 1726867330.02383: stderr chunk (state=3): >>><<< 18662 1726867330.02486: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": 
"systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": 
{"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867330.04851: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867330.04856: _low_level_execute_command(): starting 18662 1726867330.04859: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867328.25541-19717-209617153258187/ > /dev/null 2>&1 && sleep 0' 18662 1726867330.06683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867330.06796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867330.07081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867330.07084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867330.08991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867330.08994: stderr chunk (state=3): >>><<< 18662 1726867330.09071: stdout chunk (state=3): >>><<< 18662 1726867330.09074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867330.09076: handler run complete 18662 1726867330.09632: variable 'ansible_facts' from source: unknown 18662 1726867330.10238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867330.11508: variable 'ansible_facts' from source: unknown 18662 1726867330.12198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867330.12701: attempt loop complete, returning result 18662 1726867330.12751: _execute() done 18662 1726867330.12759: dumping result to json 18662 1726867330.13003: done dumping result, returning 18662 1726867330.13018: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-efab-a8ce-000000000371] 18662 1726867330.13028: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000371 18662 1726867330.16071: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000371 18662 1726867330.16074: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867330.16202: no more pending results, returning what we have 18662 1726867330.16205: results queue empty 18662 1726867330.16206: checking for any_errors_fatal 18662 1726867330.16210: done checking for any_errors_fatal 18662 1726867330.16211: checking for max_fail_percentage 18662 1726867330.16213: done checking for max_fail_percentage 18662 1726867330.16214: checking to see if all hosts have failed and the running result is not ok 18662 1726867330.16215: done checking to see if all hosts have failed 18662 1726867330.16215: getting the remaining hosts for this loop 18662 1726867330.16217: done getting the remaining hosts for this loop 18662 1726867330.16220: getting the next task for host managed_node2 18662 1726867330.16226: done getting next task for host managed_node2 18662 1726867330.16229: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 18662 1726867330.16232: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867330.16241: getting variables 18662 1726867330.16243: in VariableManager get_vars() 18662 1726867330.16274: Calling all_inventory to load vars for managed_node2 18662 1726867330.16281: Calling groups_inventory to load vars for managed_node2 18662 1726867330.16286: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867330.16295: Calling all_plugins_play to load vars for managed_node2 18662 1726867330.16298: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867330.16300: Calling groups_plugins_play to load vars for managed_node2 18662 1726867330.20895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867330.24476: done with get_vars() 18662 1726867330.24506: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:22:10 -0400 (0:00:02.057) 0:00:24.881 ****** 18662 1726867330.24613: entering _queue_task() for managed_node2/package_facts 18662 1726867330.24976: worker is 1 (out of 1 available) 18662 1726867330.24991: exiting _queue_task() for managed_node2/package_facts 18662 1726867330.25003: done queuing things up, now waiting for results queue to drain 18662 1726867330.25004: waiting for pending results... 18662 1726867330.25517: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 18662 1726867330.25900: in run() - task 0affcac9-a3a5-efab-a8ce-000000000372 18662 1726867330.26084: variable 'ansible_search_path' from source: unknown 18662 1726867330.26088: variable 'ansible_search_path' from source: unknown 18662 1726867330.26090: calling self._execute() 18662 1726867330.26171: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867330.26294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867330.26313: variable 'omit' from source: magic vars 18662 1726867330.27324: variable 'ansible_distribution_major_version' from source: facts 18662 1726867330.27329: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867330.27332: variable 'omit' from source: magic vars 18662 1726867330.27333: variable 'omit' from source: magic vars 18662 1726867330.27335: variable 'omit' from source: magic vars 18662 1726867330.27386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867330.27435: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867330.27461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867330.27485: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867330.27501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867330.27545: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867330.27553: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867330.27561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867330.27684: Set connection var ansible_timeout to 10 18662 
1726867330.27738: Set connection var ansible_connection to ssh 18662 1726867330.27741: Set connection var ansible_shell_executable to /bin/sh 18662 1726867330.27744: Set connection var ansible_shell_type to sh 18662 1726867330.27746: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867330.27748: Set connection var ansible_pipelining to False 18662 1726867330.27764: variable 'ansible_shell_executable' from source: unknown 18662 1726867330.27773: variable 'ansible_connection' from source: unknown 18662 1726867330.27783: variable 'ansible_module_compression' from source: unknown 18662 1726867330.27790: variable 'ansible_shell_type' from source: unknown 18662 1726867330.27797: variable 'ansible_shell_executable' from source: unknown 18662 1726867330.27803: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867330.27813: variable 'ansible_pipelining' from source: unknown 18662 1726867330.27846: variable 'ansible_timeout' from source: unknown 18662 1726867330.27848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867330.28036: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867330.28052: variable 'omit' from source: magic vars 18662 1726867330.28173: starting attempt loop 18662 1726867330.28179: running the handler 18662 1726867330.28182: _low_level_execute_command(): starting 18662 1726867330.28184: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867330.28896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867330.28955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867330.28981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867330.28999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867330.29080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867330.31028: stdout chunk (state=3): >>>/root <<< 18662 1726867330.31183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867330.31187: stdout chunk (state=3): >>><<< 18662 1726867330.31190: stderr chunk (state=3): >>><<< 18662 1726867330.31192: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867330.31194: _low_level_execute_command(): starting 18662 1726867330.31198: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290 `" && echo ansible-tmp-1726867330.3112333-19844-116640828989290="` echo /root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290 `" ) && sleep 0' 18662 1726867330.32483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867330.32486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867330.32489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867330.32491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867330.32528: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867330.32695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867330.32744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867330.34756: stdout chunk (state=3): >>>ansible-tmp-1726867330.3112333-19844-116640828989290=/root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290 <<< 18662 1726867330.34986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867330.34990: stdout chunk (state=3): >>><<< 18662 1726867330.34998: stderr chunk (state=3): >>><<< 18662 1726867330.35023: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867330.3112333-19844-116640828989290=/root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867330.35129: variable 'ansible_module_compression' from source: unknown 18662 1726867330.35132: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 18662 1726867330.35181: variable 'ansible_facts' from source: unknown 18662 1726867330.35882: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290/AnsiballZ_package_facts.py 18662 1726867330.36218: Sending initial data 18662 1726867330.36221: Sent initial data (162 bytes) 18662 1726867330.37244: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867330.37308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867330.37484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867330.37488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867330.37490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867330.37501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867330.37590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867330.39310: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867330.39389: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867330.39466: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpmhrrgbft /root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290/AnsiballZ_package_facts.py <<< 18662 1726867330.39470: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290/AnsiballZ_package_facts.py" <<< 18662 1726867330.39717: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpmhrrgbft" to remote "/root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290/AnsiballZ_package_facts.py" <<< 18662 1726867330.42421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867330.42613: stderr chunk (state=3): >>><<< 18662 1726867330.42616: stdout chunk (state=3): >>><<< 18662 1726867330.42618: done transferring module to remote 18662 1726867330.42621: _low_level_execute_command(): starting 18662 1726867330.42623: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290/ /root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290/AnsiballZ_package_facts.py && sleep 0' 18662 1726867330.43867: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867330.43968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867330.44218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867330.44257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 
1726867330.46183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867330.46186: stdout chunk (state=3): >>><<< 18662 1726867330.46189: stderr chunk (state=3): >>><<< 18662 1726867330.46196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867330.46198: _low_level_execute_command(): starting 18662 1726867330.46200: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290/AnsiballZ_package_facts.py && sleep 0' 18662 1726867330.47681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867330.47685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867330.47687: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867330.47690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867330.47693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867330.47695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867330.47697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867330.47957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867330.48055: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867330.92736: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 18662 1726867330.92790: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": 
"13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 18662 1726867330.93191: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": 
[{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": 
"0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", 
"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": 
"17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": 
"6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18662 1726867330.95020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867330.95023: stdout chunk (state=3): >>><<< 18662 1726867330.95026: stderr chunk (state=3): >>><<< 18662 1726867330.95284: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867331.01033: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867331.01193: _low_level_execute_command(): starting 18662 1726867331.01197: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867330.3112333-19844-116640828989290/ > /dev/null 2>&1 && sleep 0' 18662 1726867331.02926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867331.03068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867331.03076: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867331.03146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867331.05129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867331.05135: stdout chunk (state=3): >>><<< 18662 1726867331.05153: stderr chunk (state=3): >>><<< 18662 1726867331.05166: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867331.05176: handler run complete 18662 1726867331.07687: variable 'ansible_facts' from source: unknown 18662 1726867331.08195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867331.13516: variable 'ansible_facts' from source: unknown 18662 1726867331.14731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867331.16317: attempt loop complete, returning result 18662 1726867331.16328: _execute() done 18662 1726867331.16331: dumping result to json 18662 1726867331.16730: done dumping result, returning 18662 1726867331.16740: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-efab-a8ce-000000000372] 18662 1726867331.16745: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000372 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867331.21682: no more pending results, returning what we have 18662 1726867331.21685: results queue empty 18662 1726867331.21686: checking for any_errors_fatal 18662 1726867331.21694: done checking for any_errors_fatal 18662 1726867331.21695: checking for max_fail_percentage 18662 1726867331.21697: done checking for max_fail_percentage 18662 1726867331.21698: checking to see if all hosts have failed and the running result is not ok 18662 1726867331.21699: done checking to see if all hosts have failed 18662 1726867331.21701: getting the remaining hosts for this loop 18662 1726867331.21702: done getting the remaining hosts for this loop 18662 1726867331.21706: getting the next task for host managed_node2 18662 1726867331.21712: done getting next task for host managed_node2 18662 1726867331.21716: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18662 1726867331.21718: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867331.21728: getting variables 18662 1726867331.21732: in VariableManager get_vars() 18662 1726867331.21764: Calling all_inventory to load vars for managed_node2 18662 1726867331.21767: Calling groups_inventory to load vars for managed_node2 18662 1726867331.21769: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867331.21854: Calling all_plugins_play to load vars for managed_node2 18662 1726867331.21889: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867331.21896: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000372 18662 1726867331.21899: WORKER PROCESS EXITING 18662 1726867331.21931: Calling groups_plugins_play to load vars for managed_node2 18662 1726867331.25089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867331.27019: done with get_vars() 18662 1726867331.27057: done getting variables 18662 1726867331.27158: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:22:11 -0400 (0:00:01.025) 0:00:25.907 ****** 18662 1726867331.27194: entering _queue_task() for managed_node2/debug 18662 1726867331.28098: worker is 1 (out of 1 available) 18662 1726867331.28181: exiting _queue_task() for managed_node2/debug 18662 1726867331.28198: done queuing things up, now waiting for results queue to drain 18662 1726867331.28200: waiting for pending results... 
18662 1726867331.28894: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 18662 1726867331.28899: in run() - task 0affcac9-a3a5-efab-a8ce-00000000003d 18662 1726867331.28902: variable 'ansible_search_path' from source: unknown 18662 1726867331.28905: variable 'ansible_search_path' from source: unknown 18662 1726867331.28908: calling self._execute() 18662 1726867331.29164: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867331.29179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867331.29194: variable 'omit' from source: magic vars 18662 1726867331.29967: variable 'ansible_distribution_major_version' from source: facts 18662 1726867331.29988: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867331.29999: variable 'omit' from source: magic vars 18662 1726867331.30040: variable 'omit' from source: magic vars 18662 1726867331.30482: variable 'network_provider' from source: set_fact 18662 1726867331.30486: variable 'omit' from source: magic vars 18662 1726867331.30489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867331.30492: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867331.30494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867331.30498: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867331.30501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867331.30883: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867331.30886: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867331.30889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867331.30892: Set connection var ansible_timeout to 10 18662 1726867331.30894: Set connection var ansible_connection to ssh 18662 1726867331.30896: Set connection var ansible_shell_executable to /bin/sh 18662 1726867331.30898: Set connection var ansible_shell_type to sh 18662 1726867331.30899: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867331.30901: Set connection var ansible_pipelining to False 18662 1726867331.30903: variable 'ansible_shell_executable' from source: unknown 18662 1726867331.30905: variable 'ansible_connection' from source: unknown 18662 1726867331.30907: variable 'ansible_module_compression' from source: unknown 18662 1726867331.30909: variable 'ansible_shell_type' from source: unknown 18662 1726867331.30910: variable 'ansible_shell_executable' from source: unknown 18662 1726867331.30912: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867331.30914: variable 'ansible_pipelining' from source: unknown 18662 1726867331.30916: variable 'ansible_timeout' from source: unknown 18662 1726867331.30918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867331.31212: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 18662 1726867331.31482: variable 'omit' from source: magic vars 18662 1726867331.31486: starting attempt loop 18662 1726867331.31489: running the handler 18662 1726867331.31492: handler run complete 18662 1726867331.31495: attempt loop complete, returning result 18662 1726867331.31497: _execute() done 18662 1726867331.31499: dumping result to json 18662 1726867331.31502: done dumping result, returning 18662 1726867331.31505: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-efab-a8ce-00000000003d] 18662 1726867331.31508: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000003d 18662 1726867331.31606: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000003d 18662 1726867331.31614: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 18662 1726867331.31676: no more pending results, returning what we have 18662 1726867331.31680: results queue empty 18662 1726867331.31681: checking for any_errors_fatal 18662 1726867331.31691: done checking for any_errors_fatal 18662 1726867331.31692: checking for max_fail_percentage 18662 1726867331.31693: done checking for max_fail_percentage 18662 1726867331.31695: checking to see if all hosts have failed and the running result is not ok 18662 1726867331.31696: done checking to see if all hosts have failed 18662 1726867331.31696: getting the remaining hosts for this loop 18662 1726867331.31698: done getting the remaining hosts for this loop 18662 1726867331.31701: getting the next task for host managed_node2 18662 1726867331.31707: done getting next task for host managed_node2 18662 1726867331.31826: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18662 1726867331.31829: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867331.31840: getting variables 18662 1726867331.31842: in VariableManager get_vars() 18662 1726867331.31876: Calling all_inventory to load vars for managed_node2 18662 1726867331.31881: Calling groups_inventory to load vars for managed_node2 18662 1726867331.31883: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867331.32013: Calling all_plugins_play to load vars for managed_node2 18662 1726867331.32017: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867331.32021: Calling groups_plugins_play to load vars for managed_node2 18662 1726867331.35018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867331.36633: done with get_vars() 18662 1726867331.36656: done getting variables 18662 1726867331.36716: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:22:11 -0400 (0:00:00.095) 0:00:26.003 ****** 18662 1726867331.36761: entering _queue_task() for managed_node2/fail 18662 1726867331.37372: worker is 1 (out of 1 available) 18662 1726867331.37386: exiting _queue_task() for managed_node2/fail 18662 1726867331.37396: done queuing things up, now waiting for results queue to drain 18662 1726867331.37397: waiting for pending results... 
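For reference, the "Print network provider" result above ("Using network provider: nm") comes from the role's network_provider variable, which this log shows being read from set_fact. A minimal sketch of a consuming play that pins the provider explicitly; the play structure and placeholder values are assumptions for illustration, only the variable name and role name are taken from this log:

    - hosts: managed_node2
      vars:
        network_provider: nm        # value printed by the debug task above
        network_connections: []     # placeholder; real profiles would go here
      roles:
        - fedora.linux_system_roles.network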
18662 1726867331.37892: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18662 1726867331.38149: in run() - task 0affcac9-a3a5-efab-a8ce-00000000003e 18662 1726867331.38172: variable 'ansible_search_path' from source: unknown 18662 1726867331.38175: variable 'ansible_search_path' from source: unknown 18662 1726867331.38219: calling self._execute() 18662 1726867331.38330: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867331.38344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867331.38373: variable 'omit' from source: magic vars 18662 1726867331.38795: variable 'ansible_distribution_major_version' from source: facts 18662 1726867331.38806: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867331.38951: variable 'network_state' from source: role '' defaults 18662 1726867331.38960: Evaluated conditional (network_state != {}): False 18662 1726867331.38963: when evaluation is False, skipping this task 18662 1726867331.38972: _execute() done 18662 1726867331.38975: dumping result to json 18662 1726867331.38979: done dumping result, returning 18662 1726867331.38987: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-efab-a8ce-00000000003e] 18662 1726867331.38998: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000003e 18662 1726867331.39088: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000003e 18662 1726867331.39091: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867331.39140: no more pending results, returning what we have 18662 1726867331.39144: results queue empty 18662 1726867331.39145: checking for any_errors_fatal 18662 1726867331.39153: done checking for any_errors_fatal 18662 1726867331.39154: checking for max_fail_percentage 18662 1726867331.39155: done checking for max_fail_percentage 18662 1726867331.39156: checking to see if all hosts have failed and the running result is not ok 18662 1726867331.39157: done checking to see if all hosts have failed 18662 1726867331.39158: getting the remaining hosts for this loop 18662 1726867331.39159: done getting the remaining hosts for this loop 18662 1726867331.39162: getting the next task for host managed_node2 18662 1726867331.39168: done getting next task for host managed_node2 18662 1726867331.39171: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18662 1726867331.39174: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867331.39190: getting variables 18662 1726867331.39192: in VariableManager get_vars() 18662 1726867331.39227: Calling all_inventory to load vars for managed_node2 18662 1726867331.39229: Calling groups_inventory to load vars for managed_node2 18662 1726867331.39232: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867331.39243: Calling all_plugins_play to load vars for managed_node2 18662 1726867331.39245: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867331.39248: Calling groups_plugins_play to load vars for managed_node2 18662 1726867331.42222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867331.46139: done with get_vars() 18662 1726867331.46269: done getting variables 18662 1726867331.46462: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:22:11 -0400 (0:00:00.097) 0:00:26.100 ****** 18662 1726867331.46557: entering _queue_task() for managed_node2/fail 18662 1726867331.47165: worker is 1 (out of 1 available) 18662 1726867331.47176: exiting _queue_task() for managed_node2/fail 18662 1726867331.47385: done queuing things up, now waiting for results queue to drain 18662 1726867331.47390: waiting for pending results... 
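The "Abort applying the network state configuration ..." task above is skipped because the role-default network_state is empty, so the guard network_state != {} evaluates to False (recorded as false_condition in the result). A hypothetical task showing the same guard pattern; the task name and failure message are illustrative, not the role's actual wording:

    - name: Abort when a network_state configuration was supplied
      ansible.builtin.fail:
        msg: "network_state is not supported with this provider"
      when: network_state != {}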
18662 1726867331.48083: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18662 1726867331.48435: in run() - task 0affcac9-a3a5-efab-a8ce-00000000003f 18662 1726867331.48451: variable 'ansible_search_path' from source: unknown 18662 1726867331.48455: variable 'ansible_search_path' from source: unknown 18662 1726867331.48727: calling self._execute() 18662 1726867331.48995: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867331.49000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867331.49003: variable 'omit' from source: magic vars 18662 1726867331.49616: variable 'ansible_distribution_major_version' from source: facts 18662 1726867331.49620: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867331.49952: variable 'network_state' from source: role '' defaults 18662 1726867331.49955: Evaluated conditional (network_state != {}): False 18662 1726867331.49959: when evaluation is False, skipping this task 18662 1726867331.49961: _execute() done 18662 1726867331.49963: dumping result to json 18662 1726867331.49964: done dumping result, returning 18662 1726867331.49967: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-efab-a8ce-00000000003f] 18662 1726867331.49971: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000003f 18662 1726867331.50162: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000003f 18662 1726867331.50166: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867331.50243: no more pending results, returning what we have 18662 1726867331.50248: results queue empty 18662 1726867331.50249: checking for any_errors_fatal 18662 1726867331.50262: done checking for any_errors_fatal 18662 1726867331.50263: checking for max_fail_percentage 18662 1726867331.50265: done checking for max_fail_percentage 18662 1726867331.50269: checking to see if all hosts have failed and the running result is not ok 18662 1726867331.50269: done checking to see if all hosts have failed 18662 1726867331.50270: getting the remaining hosts for this loop 18662 1726867331.50272: done getting the remaining hosts for this loop 18662 1726867331.50275: getting the next task for host managed_node2 18662 1726867331.50284: done getting next task for host managed_node2 18662 1726867331.50288: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18662 1726867331.50291: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867331.50316: getting variables 18662 1726867331.50318: in VariableManager get_vars() 18662 1726867331.50360: Calling all_inventory to load vars for managed_node2 18662 1726867331.50362: Calling groups_inventory to load vars for managed_node2 18662 1726867331.50368: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867331.50726: Calling all_plugins_play to load vars for managed_node2 18662 1726867331.50731: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867331.50735: Calling groups_plugins_play to load vars for managed_node2 18662 1726867331.53017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867331.55462: done with get_vars() 18662 1726867331.55480: done getting variables 18662 1726867331.55524: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:22:11 -0400 (0:00:00.089) 0:00:26.190 ****** 18662 1726867331.55545: entering _queue_task() for managed_node2/fail 18662 1726867331.55781: worker is 1 (out of 1 available) 18662 1726867331.55795: exiting _queue_task() for managed_node2/fail 18662 1726867331.55806: done queuing things up, now waiting for results queue to drain 18662 1726867331.55808: waiting for pending results... 
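Both abort tasks above, like every task in this run, are also gated on the managed host's version facts: each evaluation starts with ansible_distribution_major_version != '6', and individual tasks add stricter checks such as | int < 8 or | int > 9. A generic sketch of that version-gating pattern; the task itself is hypothetical:

    - name: Run only on EL8 or later
      ansible.builtin.debug:
        msg: "Running on {{ ansible_distribution }} {{ ansible_distribution_major_version }}"
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version | int >= 8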
18662 1726867331.56001: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18662 1726867331.56067: in run() - task 0affcac9-a3a5-efab-a8ce-000000000040 18662 1726867331.56080: variable 'ansible_search_path' from source: unknown 18662 1726867331.56084: variable 'ansible_search_path' from source: unknown 18662 1726867331.56116: calling self._execute() 18662 1726867331.56187: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867331.56193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867331.56202: variable 'omit' from source: magic vars 18662 1726867331.56479: variable 'ansible_distribution_major_version' from source: facts 18662 1726867331.56489: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867331.56608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867331.59259: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867331.59505: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867331.59562: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867331.59596: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867331.59760: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867331.59764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867331.59767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867331.59770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.59838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867331.59853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867331.60170: variable 'ansible_distribution_major_version' from source: facts 18662 1726867331.60195: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18662 1726867331.60546: variable 'ansible_distribution' from source: facts 18662 1726867331.60549: variable '__network_rh_distros' from source: role '' defaults 18662 1726867331.60562: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18662 1726867331.61305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867331.61345: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867331.61409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.61451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867331.61480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867331.61534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867331.61584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867331.61644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.61648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867331.61665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867331.61706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867331.61750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867331.61753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.61792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867331.61805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867331.62162: variable 'network_connections' from source: play vars 18662 1726867331.62171: variable 'profile' from source: play vars 18662 1726867331.62248: variable 'profile' from source: play vars 18662 1726867331.62251: variable 'interface' from source: set_fact 18662 1726867331.62316: variable 'interface' from source: set_fact 18662 1726867331.62382: variable 'network_state' from source: role '' defaults 18662 
1726867331.62419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867331.62591: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867331.62663: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867331.62666: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867331.62691: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867331.62780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867331.62791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867331.62830: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.62854: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867331.62859: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 18662 1726867331.62861: when evaluation is False, skipping this task 18662 1726867331.62864: _execute() done 18662 1726867331.62866: dumping result to json 18662 1726867331.62980: done dumping result, returning 18662 1726867331.62984: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-efab-a8ce-000000000040] 18662 1726867331.62986: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000040 18662 1726867331.63045: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000040 18662 1726867331.63047: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 18662 1726867331.63098: no more pending results, returning what we have 18662 1726867331.63102: results queue empty 18662 1726867331.63102: checking for any_errors_fatal 18662 1726867331.63109: done checking for any_errors_fatal 18662 1726867331.63110: checking for max_fail_percentage 18662 1726867331.63111: done checking for max_fail_percentage 18662 1726867331.63112: checking to see if all hosts have failed and the running result is not ok 18662 1726867331.63113: done checking to see if all hosts have failed 18662 1726867331.63113: getting the remaining hosts for this loop 18662 1726867331.63115: done getting the remaining hosts for this loop 18662 1726867331.63120: getting the next 
task for host managed_node2 18662 1726867331.63126: done getting next task for host managed_node2 18662 1726867331.63130: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18662 1726867331.63132: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867331.63144: getting variables 18662 1726867331.63145: in VariableManager get_vars() 18662 1726867331.63183: Calling all_inventory to load vars for managed_node2 18662 1726867331.63185: Calling groups_inventory to load vars for managed_node2 18662 1726867331.63187: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867331.63196: Calling all_plugins_play to load vars for managed_node2 18662 1726867331.63199: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867331.63201: Calling groups_plugins_play to load vars for managed_node2 18662 1726867331.65084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867331.66449: done with get_vars() 18662 1726867331.66466: done getting variables 18662 1726867331.66516: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:22:11 -0400 (0:00:00.109) 0:00:26.300 ****** 18662 1726867331.66537: entering _queue_task() for managed_node2/dnf 18662 1726867331.66767: worker is 1 (out of 1 available) 18662 1726867331.66783: exiting _queue_task() for managed_node2/dnf 18662 1726867331.66794: done queuing things up, now waiting for results queue to drain 18662 1726867331.66795: waiting for pending results... 
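The teaming abort above is skipped because neither network_connections nor network_state defines a connection of type "team". The guard expression recorded in the skip result can be exercised on its own; the sample data below is hypothetical and exists only to make the conditional evaluate to True:

    - hosts: localhost
      gather_facts: false
      vars:
        network_connections:
          - name: team0            # hypothetical team profile
            type: team
        network_state: {}
      tasks:
        - name: Evaluate the same guard the role uses
          ansible.builtin.debug:
            msg: >-
              {{ network_connections | selectattr("type", "defined")
                 | selectattr("type", "match", "^team$") | list | length > 0
                 or network_state.get("interfaces", []) | selectattr("type", "defined")
                 | selectattr("type", "match", "^team$") | list | length > 0 }}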
18662 1726867331.66971: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18662 1726867331.67040: in run() - task 0affcac9-a3a5-efab-a8ce-000000000041 18662 1726867331.67052: variable 'ansible_search_path' from source: unknown 18662 1726867331.67055: variable 'ansible_search_path' from source: unknown 18662 1726867331.67086: calling self._execute() 18662 1726867331.67160: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867331.67163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867331.67174: variable 'omit' from source: magic vars 18662 1726867331.67456: variable 'ansible_distribution_major_version' from source: facts 18662 1726867331.67464: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867331.67652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867331.70230: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867331.70274: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867331.70307: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867331.70341: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867331.70369: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867331.70446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867331.70469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867331.70488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.70515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867331.70530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867331.70613: variable 'ansible_distribution' from source: facts 18662 1726867331.70618: variable 'ansible_distribution_major_version' from source: facts 18662 1726867331.70637: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 18662 1726867331.70734: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867331.70820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867331.70838: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867331.70858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.70885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867331.70896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867331.70926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867331.70943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867331.70962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.70990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867331.71001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867331.71030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867331.71046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867331.71065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.71092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867331.71102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867331.71205: variable 'network_connections' from source: play vars 18662 1726867331.71216: variable 'profile' from source: play vars 18662 1726867331.71264: variable 'profile' from source: play vars 18662 1726867331.71267: variable 'interface' from source: set_fact 18662 1726867331.71315: variable 'interface' from source: set_fact 18662 1726867331.71363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 18662 1726867331.71494: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867331.71521: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867331.71543: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867331.71569: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867331.71683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867331.71687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867331.71697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.71699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867331.71902: variable '__network_team_connections_defined' from source: role '' defaults 18662 1726867331.72261: variable 'network_connections' from source: play vars 18662 1726867331.72265: variable 'profile' from source: play vars 18662 1726867331.72326: variable 'profile' from source: play vars 18662 1726867331.72329: variable 'interface' from source: set_fact 18662 1726867331.72387: variable 'interface' from source: set_fact 18662 1726867331.72414: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18662 1726867331.72417: when evaluation is False, skipping this task 18662 1726867331.72420: _execute() done 18662 1726867331.72422: dumping result to json 18662 1726867331.72424: done dumping result, returning 18662 1726867331.72456: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-efab-a8ce-000000000041] 18662 1726867331.72484: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000041 18662 1726867331.72550: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000041 18662 1726867331.72553: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18662 1726867331.72636: no more pending results, returning what we have 18662 1726867331.72639: results queue empty 18662 1726867331.72640: checking for any_errors_fatal 18662 1726867331.72646: done checking for any_errors_fatal 18662 1726867331.72647: checking for max_fail_percentage 18662 1726867331.72649: done checking for max_fail_percentage 18662 1726867331.72649: checking to see if all hosts have failed and the running result is not ok 18662 1726867331.72650: done checking to see if all hosts have failed 18662 1726867331.72651: getting the remaining hosts for this loop 18662 1726867331.72652: done getting the remaining hosts for this loop 18662 
1726867331.72655: getting the next task for host managed_node2 18662 1726867331.72661: done getting next task for host managed_node2 18662 1726867331.72780: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18662 1726867331.72782: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867331.72795: getting variables 18662 1726867331.72796: in VariableManager get_vars() 18662 1726867331.72834: Calling all_inventory to load vars for managed_node2 18662 1726867331.72836: Calling groups_inventory to load vars for managed_node2 18662 1726867331.72839: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867331.72847: Calling all_plugins_play to load vars for managed_node2 18662 1726867331.72850: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867331.72853: Calling groups_plugins_play to load vars for managed_node2 18662 1726867331.74573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867331.76401: done with get_vars() 18662 1726867331.76426: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18662 1726867331.76518: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:22:11 -0400 (0:00:00.100) 0:00:26.400 ****** 18662 1726867331.76547: entering _queue_task() for managed_node2/yum 18662 1726867331.76935: worker is 1 (out of 1 available) 18662 1726867331.76947: exiting _queue_task() for managed_node2/yum 18662 1726867331.76959: done queuing things up, now waiting for results queue to drain 18662 1726867331.76960: waiting for pending results... 
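The DNF update check above is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for this profile; both flags are derived from the connection types found in network_connections. A wireless entry such as the hypothetical one below would be expected to satisfy the first flag (the profile name and values are invented; "type" is the key the guard inspects, and the layout follows the role's profile style only approximately):

    network_connections:
      - name: wlan0-profile        # hypothetical profile name
        type: wireless             # the connection type the guard looks for
        interface_name: wlan0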
18662 1726867331.77358: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18662 1726867331.77363: in run() - task 0affcac9-a3a5-efab-a8ce-000000000042 18662 1726867331.77366: variable 'ansible_search_path' from source: unknown 18662 1726867331.77368: variable 'ansible_search_path' from source: unknown 18662 1726867331.77371: calling self._execute() 18662 1726867331.77374: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867331.77386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867331.77392: variable 'omit' from source: magic vars 18662 1726867331.77738: variable 'ansible_distribution_major_version' from source: facts 18662 1726867331.77748: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867331.77924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867331.80191: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867331.80218: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867331.80253: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867331.80299: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867331.80336: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867331.80405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867331.80524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867331.80527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.80530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867331.80560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867331.80725: variable 'ansible_distribution_major_version' from source: facts 18662 1726867331.80728: Evaluated conditional (ansible_distribution_major_version | int < 8): False 18662 1726867331.80730: when evaluation is False, skipping this task 18662 1726867331.80737: _execute() done 18662 1726867331.80738: dumping result to json 18662 1726867331.80740: done dumping result, returning 18662 1726867331.80742: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-efab-a8ce-000000000042] 18662 
1726867331.80745: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000042 18662 1726867331.80807: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000042 18662 1726867331.80810: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 18662 1726867331.80865: no more pending results, returning what we have 18662 1726867331.80872: results queue empty 18662 1726867331.80873: checking for any_errors_fatal 18662 1726867331.80887: done checking for any_errors_fatal 18662 1726867331.80888: checking for max_fail_percentage 18662 1726867331.80889: done checking for max_fail_percentage 18662 1726867331.80890: checking to see if all hosts have failed and the running result is not ok 18662 1726867331.80891: done checking to see if all hosts have failed 18662 1726867331.80891: getting the remaining hosts for this loop 18662 1726867331.80893: done getting the remaining hosts for this loop 18662 1726867331.80896: getting the next task for host managed_node2 18662 1726867331.80902: done getting next task for host managed_node2 18662 1726867331.80906: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18662 1726867331.80907: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867331.80920: getting variables 18662 1726867331.80922: in VariableManager get_vars() 18662 1726867331.80968: Calling all_inventory to load vars for managed_node2 18662 1726867331.80971: Calling groups_inventory to load vars for managed_node2 18662 1726867331.80973: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867331.81071: Calling all_plugins_play to load vars for managed_node2 18662 1726867331.81075: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867331.81080: Calling groups_plugins_play to load vars for managed_node2 18662 1726867331.83205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867331.85147: done with get_vars() 18662 1726867331.85170: done getting variables 18662 1726867331.85294: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:22:11 -0400 (0:00:00.087) 0:00:26.488 ****** 18662 1726867331.85324: entering _queue_task() for managed_node2/fail 18662 1726867331.85844: worker is 1 (out of 1 available) 18662 1726867331.85857: exiting _queue_task() for managed_node2/fail 18662 1726867331.85869: done queuing things up, now waiting for results queue to drain 18662 1726867331.85870: waiting for pending results... 
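One detail worth noting above: the line "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" shows that although the role task is written against the yum action, ansible-core resolves it to the dnf action plugin on this host; the task was then skipped anyway because ansible_distribution_major_version | int < 8 is False. A hypothetical task written with the yum name that would be redirected the same way (the package name is illustrative):

    - name: Install a package using the legacy module name
      ansible.builtin.yum:         # resolved to the dnf action plugin, per the redirect above
        name: NetworkManager
        state: present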
18662 1726867331.86306: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18662 1726867331.86310: in run() - task 0affcac9-a3a5-efab-a8ce-000000000043 18662 1726867331.86313: variable 'ansible_search_path' from source: unknown 18662 1726867331.86315: variable 'ansible_search_path' from source: unknown 18662 1726867331.86318: calling self._execute() 18662 1726867331.86489: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867331.86493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867331.86496: variable 'omit' from source: magic vars 18662 1726867331.86855: variable 'ansible_distribution_major_version' from source: facts 18662 1726867331.86859: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867331.86963: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867331.87202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867331.90936: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867331.91003: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867331.91045: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867331.91081: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867331.91107: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867331.91244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867331.91248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867331.91250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.91291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867331.91305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867331.91349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867331.91381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867331.91430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.91568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867331.91571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867331.91574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867331.91576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867331.91580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.91616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867331.91630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867331.91807: variable 'network_connections' from source: play vars 18662 1726867331.91819: variable 'profile' from source: play vars 18662 1726867331.91914: variable 'profile' from source: play vars 18662 1726867331.91917: variable 'interface' from source: set_fact 18662 1726867331.91961: variable 'interface' from source: set_fact 18662 1726867331.92082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867331.92275: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867331.92314: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867331.92347: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867331.92373: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867331.92449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867331.92472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867331.92548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867331.92552: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867331.92604: 
variable '__network_team_connections_defined' from source: role '' defaults 18662 1726867331.92955: variable 'network_connections' from source: play vars 18662 1726867331.92958: variable 'profile' from source: play vars 18662 1726867331.93250: variable 'profile' from source: play vars 18662 1726867331.93255: variable 'interface' from source: set_fact 18662 1726867331.93515: variable 'interface' from source: set_fact 18662 1726867331.93692: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18662 1726867331.93696: when evaluation is False, skipping this task 18662 1726867331.93699: _execute() done 18662 1726867331.93701: dumping result to json 18662 1726867331.93703: done dumping result, returning 18662 1726867331.93768: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-efab-a8ce-000000000043] 18662 1726867331.93782: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000043 18662 1726867331.94074: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000043 18662 1726867331.94079: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18662 1726867331.94184: no more pending results, returning what we have 18662 1726867331.94187: results queue empty 18662 1726867331.94188: checking for any_errors_fatal 18662 1726867331.94194: done checking for any_errors_fatal 18662 1726867331.94195: checking for max_fail_percentage 18662 1726867331.94196: done checking for max_fail_percentage 18662 1726867331.94197: checking to see if all hosts have failed and the running result is not ok 18662 1726867331.94198: done checking to see if all hosts have failed 18662 1726867331.94199: getting the remaining hosts for this loop 18662 1726867331.94200: done getting the remaining hosts for this loop 18662 1726867331.94203: getting the next task for host managed_node2 18662 1726867331.94212: done getting next task for host managed_node2 18662 1726867331.94216: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18662 1726867331.94218: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867331.94233: getting variables 18662 1726867331.94234: in VariableManager get_vars() 18662 1726867331.94270: Calling all_inventory to load vars for managed_node2 18662 1726867331.94272: Calling groups_inventory to load vars for managed_node2 18662 1726867331.94274: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867331.94286: Calling all_plugins_play to load vars for managed_node2 18662 1726867331.94288: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867331.94291: Calling groups_plugins_play to load vars for managed_node2 18662 1726867331.97873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867332.00087: done with get_vars() 18662 1726867332.00111: done getting variables 18662 1726867332.00184: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:22:12 -0400 (0:00:00.148) 0:00:26.637 ****** 18662 1726867332.00216: entering _queue_task() for managed_node2/package 18662 1726867332.00574: worker is 1 (out of 1 available) 18662 1726867332.00710: exiting _queue_task() for managed_node2/package 18662 1726867332.00721: done queuing things up, now waiting for results queue to drain 18662 1726867332.00722: waiting for pending results... 18662 1726867332.01097: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 18662 1726867332.01102: in run() - task 0affcac9-a3a5-efab-a8ce-000000000044 18662 1726867332.01105: variable 'ansible_search_path' from source: unknown 18662 1726867332.01107: variable 'ansible_search_path' from source: unknown 18662 1726867332.01112: calling self._execute() 18662 1726867332.01217: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867332.01223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867332.01233: variable 'omit' from source: magic vars 18662 1726867332.01647: variable 'ansible_distribution_major_version' from source: facts 18662 1726867332.01659: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867332.02183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867332.02189: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867332.02244: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867332.02365: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867332.02430: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867332.02618: variable 'network_packages' from source: role '' defaults 18662 1726867332.02882: variable '__network_provider_setup' from source: role '' defaults 18662 1726867332.02886: variable '__network_service_name_default_nm' from source: role '' defaults 18662 1726867332.02888: variable 
'__network_service_name_default_nm' from source: role '' defaults 18662 1726867332.02891: variable '__network_packages_default_nm' from source: role '' defaults 18662 1726867332.02961: variable '__network_packages_default_nm' from source: role '' defaults 18662 1726867332.03383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867332.05987: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867332.06085: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867332.06140: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867332.06190: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867332.06217: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867332.06320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.06383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.06406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.06446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.06472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.06727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.06752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.06783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.06837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.06849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.07683: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18662 1726867332.07686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.07752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.07788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.07832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.07911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.08049: variable 'ansible_python' from source: facts 18662 1726867332.08102: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18662 1726867332.08213: variable '__network_wpa_supplicant_required' from source: role '' defaults 18662 1726867332.08293: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18662 1726867332.08484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.08625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.08660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.08768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.08785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.08951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.09055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.09091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.09131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.09143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.09502: variable 'network_connections' from source: play vars 18662 1726867332.09600: variable 'profile' from source: play vars 18662 1726867332.09780: variable 'profile' from source: play vars 18662 1726867332.09786: variable 'interface' from source: set_fact 18662 1726867332.09890: variable 'interface' from source: set_fact 18662 1726867332.09981: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867332.10009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867332.10050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.10098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867332.10183: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867332.10903: variable 'network_connections' from source: play vars 18662 1726867332.10906: variable 'profile' from source: play vars 18662 1726867332.11099: variable 'profile' from source: play vars 18662 1726867332.11106: variable 'interface' from source: set_fact 18662 1726867332.11582: variable 'interface' from source: set_fact 18662 1726867332.11586: variable '__network_packages_default_wireless' from source: role '' defaults 18662 1726867332.11604: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867332.12464: variable 'network_connections' from source: play vars 18662 1726867332.12468: variable 'profile' from source: play vars 18662 1726867332.12665: variable 'profile' from source: play vars 18662 1726867332.12669: variable 'interface' from source: set_fact 18662 1726867332.12927: variable 'interface' from source: set_fact 18662 1726867332.12986: variable '__network_packages_default_team' from source: role '' defaults 18662 1726867332.13147: variable '__network_team_connections_defined' from source: role '' defaults 18662 1726867332.13850: variable 'network_connections' from source: play vars 18662 1726867332.13854: variable 'profile' from source: play vars 18662 1726867332.13918: variable 'profile' from source: play vars 18662 1726867332.13922: variable 'interface' from source: set_fact 18662 1726867332.14267: variable 'interface' from source: set_fact 18662 1726867332.14327: variable '__network_service_name_default_initscripts' from source: role '' defaults 18662 1726867332.14513: variable '__network_service_name_default_initscripts' from source: role '' defaults 18662 1726867332.14517: variable '__network_packages_default_initscripts' from source: role '' defaults 18662 1726867332.14574: variable '__network_packages_default_initscripts' from source: role '' defaults 18662 1726867332.14943: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18662 1726867332.15497: variable 'network_connections' from source: play vars 18662 1726867332.15500: variable 'profile' from source: play vars 18662 
1726867332.15573: variable 'profile' from source: play vars 18662 1726867332.15576: variable 'interface' from source: set_fact 18662 1726867332.15645: variable 'interface' from source: set_fact 18662 1726867332.15653: variable 'ansible_distribution' from source: facts 18662 1726867332.15657: variable '__network_rh_distros' from source: role '' defaults 18662 1726867332.15678: variable 'ansible_distribution_major_version' from source: facts 18662 1726867332.15698: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18662 1726867332.15871: variable 'ansible_distribution' from source: facts 18662 1726867332.15874: variable '__network_rh_distros' from source: role '' defaults 18662 1726867332.15900: variable 'ansible_distribution_major_version' from source: facts 18662 1726867332.15919: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18662 1726867332.16095: variable 'ansible_distribution' from source: facts 18662 1726867332.16105: variable '__network_rh_distros' from source: role '' defaults 18662 1726867332.16118: variable 'ansible_distribution_major_version' from source: facts 18662 1726867332.16156: variable 'network_provider' from source: set_fact 18662 1726867332.16169: variable 'ansible_facts' from source: unknown 18662 1726867332.17024: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 18662 1726867332.17027: when evaluation is False, skipping this task 18662 1726867332.17029: _execute() done 18662 1726867332.17032: dumping result to json 18662 1726867332.17034: done dumping result, returning 18662 1726867332.17049: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-efab-a8ce-000000000044] 18662 1726867332.17052: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000044 18662 1726867332.17141: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000044 18662 1726867332.17144: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18662 1726867332.17219: no more pending results, returning what we have 18662 1726867332.17223: results queue empty 18662 1726867332.17224: checking for any_errors_fatal 18662 1726867332.17230: done checking for any_errors_fatal 18662 1726867332.17231: checking for max_fail_percentage 18662 1726867332.17233: done checking for max_fail_percentage 18662 1726867332.17234: checking to see if all hosts have failed and the running result is not ok 18662 1726867332.17235: done checking to see if all hosts have failed 18662 1726867332.17235: getting the remaining hosts for this loop 18662 1726867332.17237: done getting the remaining hosts for this loop 18662 1726867332.17241: getting the next task for host managed_node2 18662 1726867332.17249: done getting next task for host managed_node2 18662 1726867332.17253: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18662 1726867332.17256: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867332.17270: getting variables 18662 1726867332.17272: in VariableManager get_vars() 18662 1726867332.17318: Calling all_inventory to load vars for managed_node2 18662 1726867332.17321: Calling groups_inventory to load vars for managed_node2 18662 1726867332.17323: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867332.17339: Calling all_plugins_play to load vars for managed_node2 18662 1726867332.17343: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867332.17346: Calling groups_plugins_play to load vars for managed_node2 18662 1726867332.19194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867332.21720: done with get_vars() 18662 1726867332.21751: done getting variables 18662 1726867332.21817: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:22:12 -0400 (0:00:00.216) 0:00:26.853 ****** 18662 1726867332.21850: entering _queue_task() for managed_node2/package 18662 1726867332.22211: worker is 1 (out of 1 available) 18662 1726867332.22230: exiting _queue_task() for managed_node2/package 18662 1726867332.22243: done queuing things up, now waiting for results queue to drain 18662 1726867332.22244: waiting for pending results... 
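Editor's note: the "Install packages" task (tasks/main.yml:73) evaluated above resolves through the builtin package action plugin and is skipped because its condition "not network_packages is subset(ansible_facts.packages.keys())" is False, i.e. everything in network_packages is already installed on the managed node. A hedged sketch of a task consistent with that name, module, and condition (state: present is assumed; this is not the role's verbatim source):

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())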
18662 1726867332.22597: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18662 1726867332.22602: in run() - task 0affcac9-a3a5-efab-a8ce-000000000045 18662 1726867332.22605: variable 'ansible_search_path' from source: unknown 18662 1726867332.22607: variable 'ansible_search_path' from source: unknown 18662 1726867332.22633: calling self._execute() 18662 1726867332.22736: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867332.22749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867332.22764: variable 'omit' from source: magic vars 18662 1726867332.23137: variable 'ansible_distribution_major_version' from source: facts 18662 1726867332.23265: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867332.23291: variable 'network_state' from source: role '' defaults 18662 1726867332.23305: Evaluated conditional (network_state != {}): False 18662 1726867332.23315: when evaluation is False, skipping this task 18662 1726867332.23323: _execute() done 18662 1726867332.23330: dumping result to json 18662 1726867332.23337: done dumping result, returning 18662 1726867332.23348: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-efab-a8ce-000000000045] 18662 1726867332.23359: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000045 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867332.23521: no more pending results, returning what we have 18662 1726867332.23525: results queue empty 18662 1726867332.23525: checking for any_errors_fatal 18662 1726867332.23538: done checking for any_errors_fatal 18662 1726867332.23539: checking for max_fail_percentage 18662 1726867332.23540: done checking for max_fail_percentage 18662 1726867332.23541: checking to see if all hosts have failed and the running result is not ok 18662 1726867332.23542: done checking to see if all hosts have failed 18662 1726867332.23543: getting the remaining hosts for this loop 18662 1726867332.23544: done getting the remaining hosts for this loop 18662 1726867332.23548: getting the next task for host managed_node2 18662 1726867332.23555: done getting next task for host managed_node2 18662 1726867332.23558: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18662 1726867332.23560: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867332.23572: getting variables 18662 1726867332.23574: in VariableManager get_vars() 18662 1726867332.23611: Calling all_inventory to load vars for managed_node2 18662 1726867332.23614: Calling groups_inventory to load vars for managed_node2 18662 1726867332.23616: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867332.23629: Calling all_plugins_play to load vars for managed_node2 18662 1726867332.23632: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867332.23636: Calling groups_plugins_play to load vars for managed_node2 18662 1726867332.24404: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000045 18662 1726867332.24408: WORKER PROCESS EXITING 18662 1726867332.26143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867332.27841: done with get_vars() 18662 1726867332.27874: done getting variables 18662 1726867332.27931: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:22:12 -0400 (0:00:00.061) 0:00:26.915 ****** 18662 1726867332.27959: entering _queue_task() for managed_node2/package 18662 1726867332.28331: worker is 1 (out of 1 available) 18662 1726867332.28343: exiting _queue_task() for managed_node2/package 18662 1726867332.28353: done queuing things up, now waiting for results queue to drain 18662 1726867332.28354: waiting for pending results... 
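Editor's note: the entries above queue "Install python3-libnmstate when using network_state variable" (tasks/main.yml:96), again via the package action plugin; the evaluation that follows skips it because network_state is empty. A minimal hedged sketch matching that name and condition (the package name is inferred from the task title; not necessarily the role's exact source):

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}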
18662 1726867332.28605: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18662 1726867332.28713: in run() - task 0affcac9-a3a5-efab-a8ce-000000000046 18662 1726867332.28723: variable 'ansible_search_path' from source: unknown 18662 1726867332.28741: variable 'ansible_search_path' from source: unknown 18662 1726867332.28857: calling self._execute() 18662 1726867332.28872: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867332.28879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867332.28890: variable 'omit' from source: magic vars 18662 1726867332.29360: variable 'ansible_distribution_major_version' from source: facts 18662 1726867332.29394: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867332.29678: variable 'network_state' from source: role '' defaults 18662 1726867332.29703: Evaluated conditional (network_state != {}): False 18662 1726867332.29732: when evaluation is False, skipping this task 18662 1726867332.29748: _execute() done 18662 1726867332.29756: dumping result to json 18662 1726867332.29763: done dumping result, returning 18662 1726867332.29773: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-efab-a8ce-000000000046] 18662 1726867332.29825: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000046 18662 1726867332.29944: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000046 18662 1726867332.29948: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867332.29995: no more pending results, returning what we have 18662 1726867332.30000: results queue empty 18662 1726867332.30001: checking for any_errors_fatal 18662 1726867332.30006: done checking for any_errors_fatal 18662 1726867332.30007: checking for max_fail_percentage 18662 1726867332.30008: done checking for max_fail_percentage 18662 1726867332.30009: checking to see if all hosts have failed and the running result is not ok 18662 1726867332.30010: done checking to see if all hosts have failed 18662 1726867332.30011: getting the remaining hosts for this loop 18662 1726867332.30012: done getting the remaining hosts for this loop 18662 1726867332.30016: getting the next task for host managed_node2 18662 1726867332.30022: done getting next task for host managed_node2 18662 1726867332.30026: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18662 1726867332.30030: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867332.30165: getting variables 18662 1726867332.30167: in VariableManager get_vars() 18662 1726867332.30202: Calling all_inventory to load vars for managed_node2 18662 1726867332.30205: Calling groups_inventory to load vars for managed_node2 18662 1726867332.30211: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867332.30224: Calling all_plugins_play to load vars for managed_node2 18662 1726867332.30227: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867332.30230: Calling groups_plugins_play to load vars for managed_node2 18662 1726867332.31264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867332.36780: done with get_vars() 18662 1726867332.36796: done getting variables 18662 1726867332.36833: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:22:12 -0400 (0:00:00.088) 0:00:27.004 ****** 18662 1726867332.36852: entering _queue_task() for managed_node2/service 18662 1726867332.37108: worker is 1 (out of 1 available) 18662 1726867332.37128: exiting _queue_task() for managed_node2/service 18662 1726867332.37139: done queuing things up, now waiting for results queue to drain 18662 1726867332.37140: waiting for pending results... 
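Editor's note: the "Restart NetworkManager due to wireless or team interfaces" task (tasks/main.yml:109) just queued resolves through the builtin service action plugin and, as the evaluation below shows, is gated on the same wireless/team condition as the consent task. A hedged sketch only (the service name NetworkManager is inferred from the task title; the role's real task may differ):

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined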
18662 1726867332.37361: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18662 1726867332.37444: in run() - task 0affcac9-a3a5-efab-a8ce-000000000047 18662 1726867332.37454: variable 'ansible_search_path' from source: unknown 18662 1726867332.37458: variable 'ansible_search_path' from source: unknown 18662 1726867332.37487: calling self._execute() 18662 1726867332.37566: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867332.37573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867332.37584: variable 'omit' from source: magic vars 18662 1726867332.37865: variable 'ansible_distribution_major_version' from source: facts 18662 1726867332.37875: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867332.38012: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867332.38290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867332.41403: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867332.41521: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867332.41582: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867332.41601: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867332.41644: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867332.41785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.41789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.41831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.41894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.41982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.41986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.42008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.42102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18662 1726867332.42133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.42159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.42215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.42250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.42280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.42423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.42428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.42572: variable 'network_connections' from source: play vars 18662 1726867332.42592: variable 'profile' from source: play vars 18662 1726867332.42686: variable 'profile' from source: play vars 18662 1726867332.42695: variable 'interface' from source: set_fact 18662 1726867332.42763: variable 'interface' from source: set_fact 18662 1726867332.42852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867332.43107: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867332.43113: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867332.43152: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867332.43191: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867332.43247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867332.43278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867332.43322: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.43419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867332.43423: variable '__network_team_connections_defined' from source: role '' defaults 18662 
1726867332.43676: variable 'network_connections' from source: play vars 18662 1726867332.43696: variable 'profile' from source: play vars 18662 1726867332.43786: variable 'profile' from source: play vars 18662 1726867332.43796: variable 'interface' from source: set_fact 18662 1726867332.43869: variable 'interface' from source: set_fact 18662 1726867332.43912: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18662 1726867332.43920: when evaluation is False, skipping this task 18662 1726867332.43927: _execute() done 18662 1726867332.43936: dumping result to json 18662 1726867332.43964: done dumping result, returning 18662 1726867332.43974: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-efab-a8ce-000000000047] 18662 1726867332.44183: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000047 18662 1726867332.44263: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000047 18662 1726867332.44266: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18662 1726867332.44336: no more pending results, returning what we have 18662 1726867332.44341: results queue empty 18662 1726867332.44342: checking for any_errors_fatal 18662 1726867332.44349: done checking for any_errors_fatal 18662 1726867332.44350: checking for max_fail_percentage 18662 1726867332.44351: done checking for max_fail_percentage 18662 1726867332.44352: checking to see if all hosts have failed and the running result is not ok 18662 1726867332.44353: done checking to see if all hosts have failed 18662 1726867332.44354: getting the remaining hosts for this loop 18662 1726867332.44355: done getting the remaining hosts for this loop 18662 1726867332.44359: getting the next task for host managed_node2 18662 1726867332.44366: done getting next task for host managed_node2 18662 1726867332.44370: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18662 1726867332.44372: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867332.44388: getting variables 18662 1726867332.44389: in VariableManager get_vars() 18662 1726867332.44432: Calling all_inventory to load vars for managed_node2 18662 1726867332.44434: Calling groups_inventory to load vars for managed_node2 18662 1726867332.44437: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867332.44448: Calling all_plugins_play to load vars for managed_node2 18662 1726867332.44451: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867332.44454: Calling groups_plugins_play to load vars for managed_node2 18662 1726867332.46142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867332.47897: done with get_vars() 18662 1726867332.47920: done getting variables 18662 1726867332.48297: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:22:12 -0400 (0:00:00.114) 0:00:27.118 ****** 18662 1726867332.48327: entering _queue_task() for managed_node2/service 18662 1726867332.48897: worker is 1 (out of 1 available) 18662 1726867332.48916: exiting _queue_task() for managed_node2/service 18662 1726867332.48927: done queuing things up, now waiting for results queue to drain 18662 1726867332.48929: waiting for pending results... 18662 1726867332.49323: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18662 1726867332.49500: in run() - task 0affcac9-a3a5-efab-a8ce-000000000048 18662 1726867332.49504: variable 'ansible_search_path' from source: unknown 18662 1726867332.49506: variable 'ansible_search_path' from source: unknown 18662 1726867332.49508: calling self._execute() 18662 1726867332.49670: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867332.49675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867332.49688: variable 'omit' from source: magic vars 18662 1726867332.50262: variable 'ansible_distribution_major_version' from source: facts 18662 1726867332.50265: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867332.50500: variable 'network_provider' from source: set_fact 18662 1726867332.50505: variable 'network_state' from source: role '' defaults 18662 1726867332.50518: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18662 1726867332.50524: variable 'omit' from source: magic vars 18662 1726867332.50563: variable 'omit' from source: magic vars 18662 1726867332.50605: variable 'network_service_name' from source: role '' defaults 18662 1726867332.50696: variable 'network_service_name' from source: role '' defaults 18662 1726867332.50841: variable '__network_provider_setup' from source: role '' defaults 18662 1726867332.50860: variable '__network_service_name_default_nm' from source: role '' defaults 18662 1726867332.51002: variable '__network_service_name_default_nm' from source: role '' defaults 18662 1726867332.51005: variable '__network_packages_default_nm' from source: role '' defaults 
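Editor's note: the trace above (continuing below) runs the "Enable and start NetworkManager" task (tasks/main.yml:122) through the service action plugin; its conditional network_provider == "nm" or network_state != {} evaluated True, and the log shows it reading network_service_name. A hedged sketch of a task consistent with those details (the started/enabled values are assumptions, not the role's verbatim source):

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}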
18662 1726867332.51045: variable '__network_packages_default_nm' from source: role '' defaults 18662 1726867332.51340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867332.55433: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867332.55618: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867332.55636: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867332.55740: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867332.55771: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867332.55861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.55899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.55933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.55985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.56014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.56165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.56168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.56171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.56245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.56270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.56687: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18662 1726867332.56730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.56782: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.56836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.56882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.56904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.56993: variable 'ansible_python' from source: facts 18662 1726867332.57067: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18662 1726867332.57231: variable '__network_wpa_supplicant_required' from source: role '' defaults 18662 1726867332.57585: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18662 1726867332.57905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.57910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.57913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.57915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.57917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.58051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867332.58173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867332.58206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.58490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867332.58493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867332.58821: variable 'network_connections' from 
source: play vars 18662 1726867332.58824: variable 'profile' from source: play vars 18662 1726867332.58886: variable 'profile' from source: play vars 18662 1726867332.58897: variable 'interface' from source: set_fact 18662 1726867332.59111: variable 'interface' from source: set_fact 18662 1726867332.59681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867332.62204: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867332.62460: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867332.62725: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867332.62771: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867332.63200: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867332.63366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867332.63437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867332.63735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867332.63788: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867332.64483: variable 'network_connections' from source: play vars 18662 1726867332.64657: variable 'profile' from source: play vars 18662 1726867332.64660: variable 'profile' from source: play vars 18662 1726867332.64663: variable 'interface' from source: set_fact 18662 1726867332.64811: variable 'interface' from source: set_fact 18662 1726867332.64914: variable '__network_packages_default_wireless' from source: role '' defaults 18662 1726867332.65105: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867332.65716: variable 'network_connections' from source: play vars 18662 1726867332.65727: variable 'profile' from source: play vars 18662 1726867332.65913: variable 'profile' from source: play vars 18662 1726867332.65917: variable 'interface' from source: set_fact 18662 1726867332.65980: variable 'interface' from source: set_fact 18662 1726867332.66049: variable '__network_packages_default_team' from source: role '' defaults 18662 1726867332.66347: variable '__network_team_connections_defined' from source: role '' defaults 18662 1726867332.66859: variable 'network_connections' from source: play vars 18662 1726867332.67173: variable 'profile' from source: play vars 18662 1726867332.67176: variable 'profile' from source: play vars 18662 1726867332.67180: variable 'interface' from source: set_fact 18662 1726867332.67257: variable 'interface' from source: set_fact 18662 1726867332.67356: variable '__network_service_name_default_initscripts' from source: role '' defaults 18662 1726867332.67551: variable '__network_service_name_default_initscripts' from source: role '' defaults 18662 1726867332.67564: 
variable '__network_packages_default_initscripts' from source: role '' defaults 18662 1726867332.67707: variable '__network_packages_default_initscripts' from source: role '' defaults 18662 1726867332.68375: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18662 1726867332.69181: variable 'network_connections' from source: play vars 18662 1726867332.69253: variable 'profile' from source: play vars 18662 1726867332.69319: variable 'profile' from source: play vars 18662 1726867332.69471: variable 'interface' from source: set_fact 18662 1726867332.69548: variable 'interface' from source: set_fact 18662 1726867332.69585: variable 'ansible_distribution' from source: facts 18662 1726867332.69783: variable '__network_rh_distros' from source: role '' defaults 18662 1726867332.69787: variable 'ansible_distribution_major_version' from source: facts 18662 1726867332.69789: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18662 1726867332.69981: variable 'ansible_distribution' from source: facts 18662 1726867332.70023: variable '__network_rh_distros' from source: role '' defaults 18662 1726867332.70090: variable 'ansible_distribution_major_version' from source: facts 18662 1726867332.70108: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18662 1726867332.70461: variable 'ansible_distribution' from source: facts 18662 1726867332.70471: variable '__network_rh_distros' from source: role '' defaults 18662 1726867332.70484: variable 'ansible_distribution_major_version' from source: facts 18662 1726867332.70529: variable 'network_provider' from source: set_fact 18662 1726867332.70783: variable 'omit' from source: magic vars 18662 1726867332.70787: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867332.70791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867332.70794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867332.70802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867332.70892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867332.70929: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867332.71102: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867332.71105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867332.71217: Set connection var ansible_timeout to 10 18662 1726867332.71226: Set connection var ansible_connection to ssh 18662 1726867332.71237: Set connection var ansible_shell_executable to /bin/sh 18662 1726867332.71245: Set connection var ansible_shell_type to sh 18662 1726867332.71260: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867332.71270: Set connection var ansible_pipelining to False 18662 1726867332.71301: variable 'ansible_shell_executable' from source: unknown 18662 1726867332.71388: variable 'ansible_connection' from source: unknown 18662 1726867332.71397: variable 'ansible_module_compression' from source: unknown 18662 1726867332.71405: variable 'ansible_shell_type' from source: unknown 18662 1726867332.71416: variable 'ansible_shell_executable' from 
source: unknown 18662 1726867332.71429: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867332.71443: variable 'ansible_pipelining' from source: unknown 18662 1726867332.71451: variable 'ansible_timeout' from source: unknown 18662 1726867332.71459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867332.71694: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867332.71713: variable 'omit' from source: magic vars 18662 1726867332.71725: starting attempt loop 18662 1726867332.71760: running the handler 18662 1726867332.71973: variable 'ansible_facts' from source: unknown 18662 1726867332.73467: _low_level_execute_command(): starting 18662 1726867332.73602: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867332.74555: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867332.74568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867332.74586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867332.74605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867332.74705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867332.74738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867332.74813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867332.76528: stdout chunk (state=3): >>>/root <<< 18662 1726867332.76636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867332.76643: stderr chunk (state=3): >>><<< 18662 1726867332.76646: stdout chunk (state=3): >>><<< 18662 1726867332.76791: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867332.76794: _low_level_execute_command(): starting 18662 1726867332.76797: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482 `" && echo ansible-tmp-1726867332.7675567-19923-226594019579482="` echo /root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482 `" ) && sleep 0' 18662 1726867332.78093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867332.78159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867332.78168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867332.78242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867332.78340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867332.80294: stdout chunk (state=3): >>>ansible-tmp-1726867332.7675567-19923-226594019579482=/root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482 <<< 18662 1726867332.80384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867332.80438: stderr chunk (state=3): >>><<< 18662 1726867332.80478: stdout chunk (state=3): >>><<< 18662 1726867332.80504: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867332.7675567-19923-226594019579482=/root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867332.80541: variable 'ansible_module_compression' from source: unknown 18662 1726867332.80601: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 18662 1726867332.80661: variable 'ansible_facts' from source: unknown 18662 1726867332.80873: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482/AnsiballZ_systemd.py 18662 1726867332.81127: Sending initial data 18662 1726867332.81145: Sent initial data (156 bytes) 18662 1726867332.82253: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867332.82571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867332.82742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867332.82892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867332.84566: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867332.84617: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpd998fopf /root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482/AnsiballZ_systemd.py <<< 18662 1726867332.84620: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482/AnsiballZ_systemd.py" <<< 18662 1726867332.85204: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpd998fopf" to remote "/root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482/AnsiballZ_systemd.py" <<< 18662 1726867332.88083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867332.88087: stdout chunk (state=3): >>><<< 18662 1726867332.88089: stderr chunk (state=3): >>><<< 18662 1726867332.88092: done transferring module to remote 18662 1726867332.88369: _low_level_execute_command(): starting 18662 1726867332.88374: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482/ /root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482/AnsiballZ_systemd.py && sleep 0' 18662 1726867332.89690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867332.89796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867332.89801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867332.89818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867332.89821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867332.90098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867332.90101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867332.90153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867332.92033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867332.92037: stdout chunk (state=3): >>><<< 18662 1726867332.92043: stderr chunk (state=3): >>><<< 18662 1726867332.92060: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867332.92063: _low_level_execute_command(): starting 18662 1726867332.92068: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482/AnsiballZ_systemd.py && sleep 0' 18662 1726867332.93301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867332.93494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867332.93591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867333.22937: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", 
"ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4517888", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313639424", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "868675000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 18662 1726867333.22966: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", 
"LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": 
"network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-<<< 18662 1726867333.22979: stdout chunk (state=3): >>>broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18662 1726867333.24841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867333.24861: stderr chunk (state=3): >>><<< 18662 1726867333.24864: stdout chunk (state=3): >>><<< 18662 1726867333.24884: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4517888", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313639424", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "868675000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867333.25013: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867333.25027: _low_level_execute_command(): starting 18662 1726867333.25031: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867332.7675567-19923-226594019579482/ > /dev/null 2>&1 && sleep 0' 18662 1726867333.25536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867333.25539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 18662 1726867333.25544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867333.25602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 
1726867333.25656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867333.25714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867333.27552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867333.27573: stderr chunk (state=3): >>><<< 18662 1726867333.27576: stdout chunk (state=3): >>><<< 18662 1726867333.27590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867333.27596: handler run complete 18662 1726867333.27656: attempt loop complete, returning result 18662 1726867333.27660: _execute() done 18662 1726867333.27662: dumping result to json 18662 1726867333.27681: done dumping result, returning 18662 1726867333.27699: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-efab-a8ce-000000000048] 18662 1726867333.27702: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000048 18662 1726867333.27946: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000048 18662 1726867333.27950: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867333.28004: no more pending results, returning what we have 18662 1726867333.28007: results queue empty 18662 1726867333.28008: checking for any_errors_fatal 18662 1726867333.28014: done checking for any_errors_fatal 18662 1726867333.28015: checking for max_fail_percentage 18662 1726867333.28017: done checking for max_fail_percentage 18662 1726867333.28018: checking to see if all hosts have failed and the running result is not ok 18662 1726867333.28018: done checking to see if all hosts have failed 18662 1726867333.28019: getting the remaining hosts for this loop 18662 1726867333.28020: done getting the remaining hosts for this loop 18662 1726867333.28023: getting the next task for host managed_node2 18662 1726867333.28028: done getting next task for host managed_node2 18662 1726867333.28031: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18662 1726867333.28033: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867333.28043: getting variables 18662 1726867333.28044: in VariableManager get_vars() 18662 1726867333.28076: Calling all_inventory to load vars for managed_node2 18662 1726867333.28079: Calling groups_inventory to load vars for managed_node2 18662 1726867333.28081: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867333.28090: Calling all_plugins_play to load vars for managed_node2 18662 1726867333.28093: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867333.28095: Calling groups_plugins_play to load vars for managed_node2 18662 1726867333.29367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867333.30954: done with get_vars() 18662 1726867333.30974: done getting variables 18662 1726867333.31037: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:22:13 -0400 (0:00:00.827) 0:00:27.946 ****** 18662 1726867333.31080: entering _queue_task() for managed_node2/service 18662 1726867333.31439: worker is 1 (out of 1 available) 18662 1726867333.31467: exiting _queue_task() for managed_node2/service 18662 1726867333.31480: done queuing things up, now waiting for results queue to drain 18662 1726867333.31482: waiting for pending results... 
18662 1726867333.31683: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18662 1726867333.31739: in run() - task 0affcac9-a3a5-efab-a8ce-000000000049 18662 1726867333.31749: variable 'ansible_search_path' from source: unknown 18662 1726867333.31753: variable 'ansible_search_path' from source: unknown 18662 1726867333.31791: calling self._execute() 18662 1726867333.31898: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867333.31902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867333.31923: variable 'omit' from source: magic vars 18662 1726867333.32287: variable 'ansible_distribution_major_version' from source: facts 18662 1726867333.32291: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867333.32396: variable 'network_provider' from source: set_fact 18662 1726867333.32400: Evaluated conditional (network_provider == "nm"): True 18662 1726867333.32484: variable '__network_wpa_supplicant_required' from source: role '' defaults 18662 1726867333.32558: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18662 1726867333.32689: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867333.34860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867333.34895: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867333.34925: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867333.34954: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867333.34975: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867333.35044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867333.35071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867333.35090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867333.35119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867333.35131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867333.35179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867333.35198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 18662 1726867333.35219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867333.35243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867333.35255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867333.35285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867333.35305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867333.35324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867333.35347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867333.35357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867333.35484: variable 'network_connections' from source: play vars 18662 1726867333.35492: variable 'profile' from source: play vars 18662 1726867333.35557: variable 'profile' from source: play vars 18662 1726867333.35561: variable 'interface' from source: set_fact 18662 1726867333.35633: variable 'interface' from source: set_fact 18662 1726867333.35682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867333.35816: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867333.35850: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867333.35880: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867333.35903: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867333.35937: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867333.35951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867333.36003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867333.36025: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867333.36062: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867333.36271: variable 'network_connections' from source: play vars 18662 1726867333.36275: variable 'profile' from source: play vars 18662 1726867333.36317: variable 'profile' from source: play vars 18662 1726867333.36321: variable 'interface' from source: set_fact 18662 1726867333.36361: variable 'interface' from source: set_fact 18662 1726867333.36385: Evaluated conditional (__network_wpa_supplicant_required): False 18662 1726867333.36389: when evaluation is False, skipping this task 18662 1726867333.36391: _execute() done 18662 1726867333.36401: dumping result to json 18662 1726867333.36406: done dumping result, returning 18662 1726867333.36408: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-efab-a8ce-000000000049] 18662 1726867333.36410: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000049 18662 1726867333.36502: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000049 18662 1726867333.36505: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18662 1726867333.36567: no more pending results, returning what we have 18662 1726867333.36570: results queue empty 18662 1726867333.36571: checking for any_errors_fatal 18662 1726867333.36601: done checking for any_errors_fatal 18662 1726867333.36602: checking for max_fail_percentage 18662 1726867333.36604: done checking for max_fail_percentage 18662 1726867333.36605: checking to see if all hosts have failed and the running result is not ok 18662 1726867333.36605: done checking to see if all hosts have failed 18662 1726867333.36606: getting the remaining hosts for this loop 18662 1726867333.36607: done getting the remaining hosts for this loop 18662 1726867333.36611: getting the next task for host managed_node2 18662 1726867333.36618: done getting next task for host managed_node2 18662 1726867333.36622: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18662 1726867333.36623: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867333.36637: getting variables 18662 1726867333.36640: in VariableManager get_vars() 18662 1726867333.36673: Calling all_inventory to load vars for managed_node2 18662 1726867333.36676: Calling groups_inventory to load vars for managed_node2 18662 1726867333.36679: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867333.36688: Calling all_plugins_play to load vars for managed_node2 18662 1726867333.36691: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867333.36693: Calling groups_plugins_play to load vars for managed_node2 18662 1726867333.37594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867333.38484: done with get_vars() 18662 1726867333.38511: done getting variables 18662 1726867333.38558: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:22:13 -0400 (0:00:00.075) 0:00:28.021 ****** 18662 1726867333.38580: entering _queue_task() for managed_node2/service 18662 1726867333.38812: worker is 1 (out of 1 available) 18662 1726867333.38825: exiting _queue_task() for managed_node2/service 18662 1726867333.38836: done queuing things up, now waiting for results queue to drain 18662 1726867333.38837: waiting for pending results... 18662 1726867333.39008: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 18662 1726867333.39072: in run() - task 0affcac9-a3a5-efab-a8ce-00000000004a 18662 1726867333.39088: variable 'ansible_search_path' from source: unknown 18662 1726867333.39091: variable 'ansible_search_path' from source: unknown 18662 1726867333.39119: calling self._execute() 18662 1726867333.39192: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867333.39196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867333.39204: variable 'omit' from source: magic vars 18662 1726867333.39586: variable 'ansible_distribution_major_version' from source: facts 18662 1726867333.39608: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867333.39745: variable 'network_provider' from source: set_fact 18662 1726867333.39749: Evaluated conditional (network_provider == "initscripts"): False 18662 1726867333.39751: when evaluation is False, skipping this task 18662 1726867333.39753: _execute() done 18662 1726867333.39756: dumping result to json 18662 1726867333.39760: done dumping result, returning 18662 1726867333.39767: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-efab-a8ce-00000000004a] 18662 1726867333.39770: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000004a skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867333.39973: no more pending results, returning what we have 18662 1726867333.39980: results queue empty 18662 1726867333.39981: checking for 
any_errors_fatal 18662 1726867333.39987: done checking for any_errors_fatal 18662 1726867333.39989: checking for max_fail_percentage 18662 1726867333.39994: done checking for max_fail_percentage 18662 1726867333.39995: checking to see if all hosts have failed and the running result is not ok 18662 1726867333.39996: done checking to see if all hosts have failed 18662 1726867333.39997: getting the remaining hosts for this loop 18662 1726867333.39998: done getting the remaining hosts for this loop 18662 1726867333.40001: getting the next task for host managed_node2 18662 1726867333.40009: done getting next task for host managed_node2 18662 1726867333.40043: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18662 1726867333.40050: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867333.40109: getting variables 18662 1726867333.40111: in VariableManager get_vars() 18662 1726867333.40150: Calling all_inventory to load vars for managed_node2 18662 1726867333.40152: Calling groups_inventory to load vars for managed_node2 18662 1726867333.40154: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867333.40165: Calling all_plugins_play to load vars for managed_node2 18662 1726867333.40189: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867333.40212: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000004a 18662 1726867333.40215: WORKER PROCESS EXITING 18662 1726867333.40235: Calling groups_plugins_play to load vars for managed_node2 18662 1726867333.41554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867333.42937: done with get_vars() 18662 1726867333.42959: done getting variables 18662 1726867333.43034: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:22:13 -0400 (0:00:00.044) 0:00:28.066 ****** 18662 1726867333.43070: entering _queue_task() for managed_node2/copy 18662 1726867333.43359: worker is 1 (out of 1 available) 18662 1726867333.43371: exiting _queue_task() for managed_node2/copy 18662 1726867333.43384: done queuing things up, now waiting for results queue to drain 18662 1726867333.43385: waiting for pending results... 
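The task that just finished ("Enable network service", main.yml:142) and the one just queued ("Ensure initscripts network file dependency is present", main.yml:150) are both gated on network_provider == "initscripts", so both are skipped on this run, where the provider is nm. The censored skip result above is what a skipped task looks like when no_log: true is set. A rough sketch of two tasks gated like this follows; the module choices (service, copy) match the action plugins loaded in the trace, but the service name, destination path, and file content are assumptions:

    - name: Enable network service
      ansible.builtin.service:
        name: network              # assumed: the legacy initscripts service
        enabled: true
      no_log: true                 # explains the "output has been hidden" skip result
      when: network_provider == "initscripts"

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network   # assumed path
        content: ""                    # assumed content
        mode: "0644"
      when: network_provider == "initscripts"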
18662 1726867333.43567: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18662 1726867333.43646: in run() - task 0affcac9-a3a5-efab-a8ce-00000000004b 18662 1726867333.43656: variable 'ansible_search_path' from source: unknown 18662 1726867333.43660: variable 'ansible_search_path' from source: unknown 18662 1726867333.43690: calling self._execute() 18662 1726867333.43839: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867333.43843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867333.43845: variable 'omit' from source: magic vars 18662 1726867333.44389: variable 'ansible_distribution_major_version' from source: facts 18662 1726867333.44392: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867333.44395: variable 'network_provider' from source: set_fact 18662 1726867333.44397: Evaluated conditional (network_provider == "initscripts"): False 18662 1726867333.44400: when evaluation is False, skipping this task 18662 1726867333.44402: _execute() done 18662 1726867333.44403: dumping result to json 18662 1726867333.44405: done dumping result, returning 18662 1726867333.44408: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-efab-a8ce-00000000004b] 18662 1726867333.44410: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000004b 18662 1726867333.44475: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000004b 18662 1726867333.44492: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18662 1726867333.44526: no more pending results, returning what we have 18662 1726867333.44528: results queue empty 18662 1726867333.44529: checking for any_errors_fatal 18662 1726867333.44533: done checking for any_errors_fatal 18662 1726867333.44534: checking for max_fail_percentage 18662 1726867333.44536: done checking for max_fail_percentage 18662 1726867333.44536: checking to see if all hosts have failed and the running result is not ok 18662 1726867333.44537: done checking to see if all hosts have failed 18662 1726867333.44538: getting the remaining hosts for this loop 18662 1726867333.44539: done getting the remaining hosts for this loop 18662 1726867333.44542: getting the next task for host managed_node2 18662 1726867333.44546: done getting next task for host managed_node2 18662 1726867333.44549: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18662 1726867333.44551: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867333.44562: getting variables 18662 1726867333.44563: in VariableManager get_vars() 18662 1726867333.44753: Calling all_inventory to load vars for managed_node2 18662 1726867333.44756: Calling groups_inventory to load vars for managed_node2 18662 1726867333.44759: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867333.44767: Calling all_plugins_play to load vars for managed_node2 18662 1726867333.44770: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867333.44773: Calling groups_plugins_play to load vars for managed_node2 18662 1726867333.46013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867333.47007: done with get_vars() 18662 1726867333.47024: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:22:13 -0400 (0:00:00.040) 0:00:28.106 ****** 18662 1726867333.47081: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 18662 1726867333.47279: worker is 1 (out of 1 available) 18662 1726867333.47292: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 18662 1726867333.47302: done queuing things up, now waiting for results queue to drain 18662 1726867333.47303: waiting for pending results... 18662 1726867333.47467: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18662 1726867333.47540: in run() - task 0affcac9-a3a5-efab-a8ce-00000000004c 18662 1726867333.47551: variable 'ansible_search_path' from source: unknown 18662 1726867333.47554: variable 'ansible_search_path' from source: unknown 18662 1726867333.47582: calling self._execute() 18662 1726867333.47655: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867333.47659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867333.47668: variable 'omit' from source: magic vars 18662 1726867333.47934: variable 'ansible_distribution_major_version' from source: facts 18662 1726867333.47943: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867333.47948: variable 'omit' from source: magic vars 18662 1726867333.47978: variable 'omit' from source: magic vars 18662 1726867333.48088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867333.50582: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867333.50586: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867333.50635: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867333.50674: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867333.50722: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867333.50811: variable 'network_provider' from source: set_fact 18662 1726867333.50966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 18662 1726867333.51019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867333.51062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867333.51113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867333.51136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867333.51261: variable 'omit' from source: magic vars 18662 1726867333.51351: variable 'omit' from source: magic vars 18662 1726867333.51476: variable 'network_connections' from source: play vars 18662 1726867333.51498: variable 'profile' from source: play vars 18662 1726867333.51585: variable 'profile' from source: play vars 18662 1726867333.51594: variable 'interface' from source: set_fact 18662 1726867333.51651: variable 'interface' from source: set_fact 18662 1726867333.51914: variable 'omit' from source: magic vars 18662 1726867333.51919: variable '__lsr_ansible_managed' from source: task vars 18662 1726867333.51921: variable '__lsr_ansible_managed' from source: task vars 18662 1726867333.52199: Loaded config def from plugin (lookup/template) 18662 1726867333.52212: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18662 1726867333.52255: File lookup term: get_ansible_managed.j2 18662 1726867333.52264: variable 'ansible_search_path' from source: unknown 18662 1726867333.52348: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18662 1726867333.52354: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18662 1726867333.52362: variable 'ansible_search_path' from source: unknown 18662 1726867333.56732: variable 'ansible_managed' from source: unknown 18662 1726867333.56812: variable 'omit' from source: magic vars 18662 1726867333.56833: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867333.56858: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867333.56873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867333.56888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867333.56896: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867333.56927: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867333.56931: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867333.56933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867333.57000: Set connection var ansible_timeout to 10 18662 1726867333.57003: Set connection var ansible_connection to ssh 18662 1726867333.57008: Set connection var ansible_shell_executable to /bin/sh 18662 1726867333.57013: Set connection var ansible_shell_type to sh 18662 1726867333.57021: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867333.57026: Set connection var ansible_pipelining to False 18662 1726867333.57043: variable 'ansible_shell_executable' from source: unknown 18662 1726867333.57046: variable 'ansible_connection' from source: unknown 18662 1726867333.57048: variable 'ansible_module_compression' from source: unknown 18662 1726867333.57051: variable 'ansible_shell_type' from source: unknown 18662 1726867333.57053: variable 'ansible_shell_executable' from source: unknown 18662 1726867333.57055: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867333.57060: variable 'ansible_pipelining' from source: unknown 18662 1726867333.57062: variable 'ansible_timeout' from source: unknown 18662 1726867333.57066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867333.57156: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867333.57167: variable 'omit' from source: magic vars 18662 1726867333.57170: starting attempt loop 18662 1726867333.57173: running the handler 18662 1726867333.57193: _low_level_execute_command(): starting 18662 1726867333.57199: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867333.57666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867333.57670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867333.57673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 
1726867333.57675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867333.57720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867333.57726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867333.57727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867333.57775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867333.59454: stdout chunk (state=3): >>>/root <<< 18662 1726867333.59592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867333.59595: stdout chunk (state=3): >>><<< 18662 1726867333.59606: stderr chunk (state=3): >>><<< 18662 1726867333.59784: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867333.59787: _low_level_execute_command(): starting 18662 1726867333.59791: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663 `" && echo ansible-tmp-1726867333.5962942-19973-227247395684663="` echo /root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663 `" ) && sleep 0' 18662 1726867333.60316: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867333.60340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867333.60407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867333.62312: stdout chunk (state=3): >>>ansible-tmp-1726867333.5962942-19973-227247395684663=/root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663 <<< 18662 1726867333.62483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867333.62488: stdout chunk (state=3): >>><<< 18662 1726867333.62491: stderr chunk (state=3): >>><<< 18662 1726867333.62519: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867333.5962942-19973-227247395684663=/root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867333.62685: variable 'ansible_module_compression' from source: unknown 18662 1726867333.62688: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 18662 1726867333.62691: variable 'ansible_facts' from source: unknown 18662 1726867333.62776: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663/AnsiballZ_network_connections.py 18662 1726867333.63033: Sending initial data 18662 1726867333.63037: Sent initial data (168 bytes) 18662 1726867333.63617: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867333.63632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867333.63684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867333.63705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867333.63792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867333.63811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867333.63827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867333.63850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867333.63921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867333.65484: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867333.65524: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867333.65588: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp_67qxxut /root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663/AnsiballZ_network_connections.py <<< 18662 1726867333.65591: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663/AnsiballZ_network_connections.py" <<< 18662 1726867333.65624: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp_67qxxut" to remote "/root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663/AnsiballZ_network_connections.py" <<< 18662 1726867333.66733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867333.66876: stderr chunk (state=3): >>><<< 18662 1726867333.66881: stdout chunk (state=3): >>><<< 18662 1726867333.66883: done transferring module to remote 18662 1726867333.66885: _low_level_execute_command(): starting 18662 1726867333.66887: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663/ /root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663/AnsiballZ_network_connections.py && sleep 0' 18662 1726867333.67550: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867333.67562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867333.67656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867333.67689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867333.67715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867333.67790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867333.67807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867333.69588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867333.69900: stdout chunk (state=3): >>><<< 18662 1726867333.69903: stderr chunk (state=3): >>><<< 18662 1726867333.69906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867333.69912: _low_level_execute_command(): starting 18662 1726867333.69915: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663/AnsiballZ_network_connections.py && sleep 0' 18662 1726867333.71061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867333.71069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867333.71154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867334.01124: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18662 1726867334.03198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867334.03202: stderr chunk (state=3): >>><<< 18662 1726867334.03236: stdout chunk (state=3): >>><<< 18662 1726867334.03240: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
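The module stdout above already contains the fully resolved arguments the role passed to fedora.linux_system_roles.network_connections. Rewritten as YAML for readability (this is a transcription of the logged module_args, not the role's source; the role builds these arguments from the network_connections, profile, and interface variables resolved earlier in the trace):

    fedora.linux_system_roles.network_connections:
      provider: nm
      connections:
        - name: lsr27
          state: down
      __header: |
        #
        # Ansible managed
        #
        # system_role:network
      ignore_errors: false
      force_state_change: false
      __debug_flags: ""

The surrounding trace also shows the standard module delivery sequence: create a remote temp directory, sftp the AnsiballZ payload, chmod it, run it with /usr/bin/python3.12, then remove the temp directory.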
18662 1726867334.03266: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867334.03375: _low_level_execute_command(): starting 18662 1726867334.03381: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867333.5962942-19973-227247395684663/ > /dev/null 2>&1 && sleep 0' 18662 1726867334.03849: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867334.04086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867334.04090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867334.04092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867334.04194: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867334.04257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867334.06158: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867334.06161: stdout chunk (state=3): >>><<< 18662 1726867334.06167: stderr chunk (state=3): >>><<< 18662 1726867334.06184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867334.06382: handler run complete 18662 1726867334.06385: attempt loop complete, returning result 18662 1726867334.06387: _execute() done 18662 1726867334.06388: dumping result to json 18662 1726867334.06390: done dumping result, returning 18662 1726867334.06392: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-efab-a8ce-00000000004c] 18662 1726867334.06393: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000004c 18662 1726867334.06470: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000004c 18662 1726867334.06473: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 18662 1726867334.06559: no more pending results, returning what we have 18662 1726867334.06562: results queue empty 18662 1726867334.06562: checking for any_errors_fatal 18662 1726867334.06567: done checking for any_errors_fatal 18662 1726867334.06568: checking for max_fail_percentage 18662 1726867334.06569: done checking for max_fail_percentage 18662 1726867334.06570: checking to see if all hosts have failed and the running result is not ok 18662 1726867334.06571: done checking to see if all hosts have failed 18662 1726867334.06571: getting the remaining hosts for this loop 18662 1726867334.06573: done getting the remaining hosts for this loop 18662 1726867334.06576: getting the next task for host managed_node2 18662 1726867334.06588: done getting next task for host managed_node2 18662 1726867334.06595: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18662 1726867334.06597: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867334.06606: getting variables 18662 1726867334.06607: in VariableManager get_vars() 18662 1726867334.06664: Calling all_inventory to load vars for managed_node2 18662 1726867334.06667: Calling groups_inventory to load vars for managed_node2 18662 1726867334.06670: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867334.06681: Calling all_plugins_play to load vars for managed_node2 18662 1726867334.06684: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867334.06693: Calling groups_plugins_play to load vars for managed_node2 18662 1726867334.09111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867334.13081: done with get_vars() 18662 1726867334.13287: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:22:14 -0400 (0:00:00.664) 0:00:28.770 ****** 18662 1726867334.13532: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 18662 1726867334.14571: worker is 1 (out of 1 available) 18662 1726867334.14585: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 18662 1726867334.14597: done queuing things up, now waiting for results queue to drain 18662 1726867334.14599: waiting for pending results... 18662 1726867334.15316: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 18662 1726867334.15322: in run() - task 0affcac9-a3a5-efab-a8ce-00000000004d 18662 1726867334.15325: variable 'ansible_search_path' from source: unknown 18662 1726867334.15328: variable 'ansible_search_path' from source: unknown 18662 1726867334.15331: calling self._execute() 18662 1726867334.15630: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867334.15635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867334.15646: variable 'omit' from source: magic vars 18662 1726867334.16626: variable 'ansible_distribution_major_version' from source: facts 18662 1726867334.16641: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867334.16782: variable 'network_state' from source: role '' defaults 18662 1726867334.16807: Evaluated conditional (network_state != {}): False 18662 1726867334.16818: when evaluation is False, skipping this task 18662 1726867334.16824: _execute() done 18662 1726867334.16831: dumping result to json 18662 1726867334.16837: done dumping result, returning 18662 1726867334.16927: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-efab-a8ce-00000000004d] 18662 1726867334.16931: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000004d 18662 1726867334.16999: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000004d 18662 1726867334.17002: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867334.17065: no more pending results, returning what we have 18662 1726867334.17070: results queue empty 18662 1726867334.17071: checking for any_errors_fatal 18662 1726867334.17083: done checking for any_errors_fatal 18662 1726867334.17084: checking for max_fail_percentage 18662 
1726867334.17085: done checking for max_fail_percentage 18662 1726867334.17087: checking to see if all hosts have failed and the running result is not ok 18662 1726867334.17087: done checking to see if all hosts have failed 18662 1726867334.17088: getting the remaining hosts for this loop 18662 1726867334.17089: done getting the remaining hosts for this loop 18662 1726867334.17093: getting the next task for host managed_node2 18662 1726867334.17100: done getting next task for host managed_node2 18662 1726867334.17104: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18662 1726867334.17107: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867334.17126: getting variables 18662 1726867334.17128: in VariableManager get_vars() 18662 1726867334.17168: Calling all_inventory to load vars for managed_node2 18662 1726867334.17171: Calling groups_inventory to load vars for managed_node2 18662 1726867334.17174: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867334.17296: Calling all_plugins_play to load vars for managed_node2 18662 1726867334.17301: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867334.17304: Calling groups_plugins_play to load vars for managed_node2 18662 1726867334.19403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867334.22737: done with get_vars() 18662 1726867334.22767: done getting variables 18662 1726867334.22842: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:22:14 -0400 (0:00:00.093) 0:00:28.864 ****** 18662 1726867334.22876: entering _queue_task() for managed_node2/debug 18662 1726867334.23301: worker is 1 (out of 1 available) 18662 1726867334.23318: exiting _queue_task() for managed_node2/debug 18662 1726867334.23330: done queuing things up, now waiting for results queue to drain 18662 1726867334.23331: waiting for pending results... 
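Before the debug task that was just queued runs, note that "Configure networking state" (main.yml:171) was skipped because network_state still has its role default of {}, so the gate network_state != {} is false. The stderr task itself almost certainly just prints part of the registered module result; a sketch consistent with the output shown below, where the variable name is taken from that output but the task body is an assumption:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines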
18662 1726867334.23698: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18662 1726867334.23740: in run() - task 0affcac9-a3a5-efab-a8ce-00000000004e 18662 1726867334.23762: variable 'ansible_search_path' from source: unknown 18662 1726867334.23770: variable 'ansible_search_path' from source: unknown 18662 1726867334.23826: calling self._execute() 18662 1726867334.23937: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867334.24007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867334.24016: variable 'omit' from source: magic vars 18662 1726867334.24551: variable 'ansible_distribution_major_version' from source: facts 18662 1726867334.24555: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867334.24558: variable 'omit' from source: magic vars 18662 1726867334.24560: variable 'omit' from source: magic vars 18662 1726867334.24696: variable 'omit' from source: magic vars 18662 1726867334.24742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867334.24903: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867334.24933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867334.24956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867334.24973: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867334.25045: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867334.25092: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867334.25101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867334.25507: Set connection var ansible_timeout to 10 18662 1726867334.25513: Set connection var ansible_connection to ssh 18662 1726867334.25516: Set connection var ansible_shell_executable to /bin/sh 18662 1726867334.25518: Set connection var ansible_shell_type to sh 18662 1726867334.25520: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867334.25522: Set connection var ansible_pipelining to False 18662 1726867334.25524: variable 'ansible_shell_executable' from source: unknown 18662 1726867334.25526: variable 'ansible_connection' from source: unknown 18662 1726867334.25529: variable 'ansible_module_compression' from source: unknown 18662 1726867334.25531: variable 'ansible_shell_type' from source: unknown 18662 1726867334.25533: variable 'ansible_shell_executable' from source: unknown 18662 1726867334.25535: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867334.25537: variable 'ansible_pipelining' from source: unknown 18662 1726867334.25543: variable 'ansible_timeout' from source: unknown 18662 1726867334.25552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867334.25818: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 
1726867334.25894: variable 'omit' from source: magic vars 18662 1726867334.25902: starting attempt loop 18662 1726867334.25951: running the handler 18662 1726867334.26201: variable '__network_connections_result' from source: set_fact 18662 1726867334.26347: handler run complete 18662 1726867334.26362: attempt loop complete, returning result 18662 1726867334.26369: _execute() done 18662 1726867334.26385: dumping result to json 18662 1726867334.26418: done dumping result, returning 18662 1726867334.26431: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-efab-a8ce-00000000004e] 18662 1726867334.26496: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000004e ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 18662 1726867334.26744: no more pending results, returning what we have 18662 1726867334.26748: results queue empty 18662 1726867334.26749: checking for any_errors_fatal 18662 1726867334.26755: done checking for any_errors_fatal 18662 1726867334.26756: checking for max_fail_percentage 18662 1726867334.26757: done checking for max_fail_percentage 18662 1726867334.26759: checking to see if all hosts have failed and the running result is not ok 18662 1726867334.26759: done checking to see if all hosts have failed 18662 1726867334.26760: getting the remaining hosts for this loop 18662 1726867334.26761: done getting the remaining hosts for this loop 18662 1726867334.26765: getting the next task for host managed_node2 18662 1726867334.26772: done getting next task for host managed_node2 18662 1726867334.26776: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18662 1726867334.26780: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867334.26790: getting variables 18662 1726867334.26793: in VariableManager get_vars() 18662 1726867334.26835: Calling all_inventory to load vars for managed_node2 18662 1726867334.26837: Calling groups_inventory to load vars for managed_node2 18662 1726867334.26840: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867334.26851: Calling all_plugins_play to load vars for managed_node2 18662 1726867334.26856: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867334.26859: Calling groups_plugins_play to load vars for managed_node2 18662 1726867334.27431: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000004e 18662 1726867334.27435: WORKER PROCESS EXITING 18662 1726867334.29849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867334.32669: done with get_vars() 18662 1726867334.32698: done getting variables 18662 1726867334.32762: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:22:14 -0400 (0:00:00.100) 0:00:28.964 ****** 18662 1726867334.32912: entering _queue_task() for managed_node2/debug 18662 1726867334.33644: worker is 1 (out of 1 available) 18662 1726867334.33656: exiting _queue_task() for managed_node2/debug 18662 1726867334.33668: done queuing things up, now waiting for results queue to drain 18662 1726867334.33669: waiting for pending results... 
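As with the previous task, the body queued here can only be inferred from this log. Given the debug action plugin and the full __network_connections_result dictionary printed in its result below, a plausible sketch of the task at roles/network/tasks/main.yml:181 is:

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result
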
18662 1726867334.34122: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18662 1726867334.34319: in run() - task 0affcac9-a3a5-efab-a8ce-00000000004f 18662 1726867334.34333: variable 'ansible_search_path' from source: unknown 18662 1726867334.34337: variable 'ansible_search_path' from source: unknown 18662 1726867334.34370: calling self._execute() 18662 1726867334.34684: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867334.34688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867334.34696: variable 'omit' from source: magic vars 18662 1726867334.35351: variable 'ansible_distribution_major_version' from source: facts 18662 1726867334.35485: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867334.35493: variable 'omit' from source: magic vars 18662 1726867334.35531: variable 'omit' from source: magic vars 18662 1726867334.35571: variable 'omit' from source: magic vars 18662 1726867334.35723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867334.35757: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867334.35783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867334.35901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867334.35915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867334.35944: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867334.35948: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867334.35951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867334.36182: Set connection var ansible_timeout to 10 18662 1726867334.36185: Set connection var ansible_connection to ssh 18662 1726867334.36265: Set connection var ansible_shell_executable to /bin/sh 18662 1726867334.36267: Set connection var ansible_shell_type to sh 18662 1726867334.36291: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867334.36294: Set connection var ansible_pipelining to False 18662 1726867334.36347: variable 'ansible_shell_executable' from source: unknown 18662 1726867334.36350: variable 'ansible_connection' from source: unknown 18662 1726867334.36353: variable 'ansible_module_compression' from source: unknown 18662 1726867334.36355: variable 'ansible_shell_type' from source: unknown 18662 1726867334.36357: variable 'ansible_shell_executable' from source: unknown 18662 1726867334.36359: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867334.36361: variable 'ansible_pipelining' from source: unknown 18662 1726867334.36363: variable 'ansible_timeout' from source: unknown 18662 1726867334.36365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867334.36845: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 
1726867334.36891: variable 'omit' from source: magic vars 18662 1726867334.36897: starting attempt loop 18662 1726867334.36900: running the handler 18662 1726867334.36948: variable '__network_connections_result' from source: set_fact 18662 1726867334.37023: variable '__network_connections_result' from source: set_fact 18662 1726867334.37326: handler run complete 18662 1726867334.37355: attempt loop complete, returning result 18662 1726867334.37358: _execute() done 18662 1726867334.37361: dumping result to json 18662 1726867334.37363: done dumping result, returning 18662 1726867334.37366: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-efab-a8ce-00000000004f] 18662 1726867334.37368: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000004f ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 18662 1726867334.37755: no more pending results, returning what we have 18662 1726867334.37758: results queue empty 18662 1726867334.37759: checking for any_errors_fatal 18662 1726867334.37765: done checking for any_errors_fatal 18662 1726867334.37765: checking for max_fail_percentage 18662 1726867334.37767: done checking for max_fail_percentage 18662 1726867334.37767: checking to see if all hosts have failed and the running result is not ok 18662 1726867334.37768: done checking to see if all hosts have failed 18662 1726867334.37769: getting the remaining hosts for this loop 18662 1726867334.37770: done getting the remaining hosts for this loop 18662 1726867334.37773: getting the next task for host managed_node2 18662 1726867334.37781: done getting next task for host managed_node2 18662 1726867334.37785: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18662 1726867334.37786: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867334.37795: getting variables 18662 1726867334.37797: in VariableManager get_vars() 18662 1726867334.37829: Calling all_inventory to load vars for managed_node2 18662 1726867334.37832: Calling groups_inventory to load vars for managed_node2 18662 1726867334.37834: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867334.37842: Calling all_plugins_play to load vars for managed_node2 18662 1726867334.37844: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867334.37847: Calling groups_plugins_play to load vars for managed_node2 18662 1726867334.38503: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000004f 18662 1726867334.38506: WORKER PROCESS EXITING 18662 1726867334.41731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867334.43782: done with get_vars() 18662 1726867334.43807: done getting variables 18662 1726867334.43882: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:22:14 -0400 (0:00:00.110) 0:00:29.074 ****** 18662 1726867334.43919: entering _queue_task() for managed_node2/debug 18662 1726867334.44627: worker is 1 (out of 1 available) 18662 1726867334.44639: exiting _queue_task() for managed_node2/debug 18662 1726867334.44650: done queuing things up, now waiting for results queue to drain 18662 1726867334.44652: waiting for pending results... 
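The task queued here is skipped further down because its condition evaluates to False (false_condition: network_state != {}). A plausible sketch of the task at roles/network/tasks/main.yml:186 follows; the variable being printed is an assumption, since the skip hides the actual task body in this log:

    - name: Show debug messages for the network_state
      ansible.builtin.debug:
        var: network_state
      when: network_state != {}
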
18662 1726867334.45201: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18662 1726867334.45206: in run() - task 0affcac9-a3a5-efab-a8ce-000000000050 18662 1726867334.45213: variable 'ansible_search_path' from source: unknown 18662 1726867334.45215: variable 'ansible_search_path' from source: unknown 18662 1726867334.45426: calling self._execute() 18662 1726867334.45712: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867334.45730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867334.45746: variable 'omit' from source: magic vars 18662 1726867334.46490: variable 'ansible_distribution_major_version' from source: facts 18662 1726867334.46507: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867334.46657: variable 'network_state' from source: role '' defaults 18662 1726867334.46671: Evaluated conditional (network_state != {}): False 18662 1726867334.46681: when evaluation is False, skipping this task 18662 1726867334.46695: _execute() done 18662 1726867334.46702: dumping result to json 18662 1726867334.46712: done dumping result, returning 18662 1726867334.46724: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-efab-a8ce-000000000050] 18662 1726867334.46734: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000050 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 18662 1726867334.46959: no more pending results, returning what we have 18662 1726867334.46964: results queue empty 18662 1726867334.46965: checking for any_errors_fatal 18662 1726867334.46975: done checking for any_errors_fatal 18662 1726867334.46976: checking for max_fail_percentage 18662 1726867334.46979: done checking for max_fail_percentage 18662 1726867334.46981: checking to see if all hosts have failed and the running result is not ok 18662 1726867334.46981: done checking to see if all hosts have failed 18662 1726867334.46982: getting the remaining hosts for this loop 18662 1726867334.46984: done getting the remaining hosts for this loop 18662 1726867334.46988: getting the next task for host managed_node2 18662 1726867334.46996: done getting next task for host managed_node2 18662 1726867334.47000: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18662 1726867334.47004: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867334.47025: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000050 18662 1726867334.47028: WORKER PROCESS EXITING 18662 1726867334.47037: getting variables 18662 1726867334.47039: in VariableManager get_vars() 18662 1726867334.47126: Calling all_inventory to load vars for managed_node2 18662 1726867334.47129: Calling groups_inventory to load vars for managed_node2 18662 1726867334.47132: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867334.47145: Calling all_plugins_play to load vars for managed_node2 18662 1726867334.47149: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867334.47151: Calling groups_plugins_play to load vars for managed_node2 18662 1726867334.49232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867334.51684: done with get_vars() 18662 1726867334.51706: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:22:14 -0400 (0:00:00.078) 0:00:29.153 ****** 18662 1726867334.51814: entering _queue_task() for managed_node2/ping 18662 1726867334.52154: worker is 1 (out of 1 available) 18662 1726867334.52166: exiting _queue_task() for managed_node2/ping 18662 1726867334.52180: done queuing things up, now waiting for results queue to drain 18662 1726867334.52182: waiting for pending results... 18662 1726867334.52899: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 18662 1726867334.53053: in run() - task 0affcac9-a3a5-efab-a8ce-000000000051 18662 1726867334.53075: variable 'ansible_search_path' from source: unknown 18662 1726867334.53110: variable 'ansible_search_path' from source: unknown 18662 1726867334.53319: calling self._execute() 18662 1726867334.53453: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867334.53471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867334.53489: variable 'omit' from source: magic vars 18662 1726867334.54230: variable 'ansible_distribution_major_version' from source: facts 18662 1726867334.54254: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867334.54266: variable 'omit' from source: magic vars 18662 1726867334.54315: variable 'omit' from source: magic vars 18662 1726867334.54366: variable 'omit' from source: magic vars 18662 1726867334.54415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867334.54465: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867334.54496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867334.54524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867334.54547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867334.54590: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867334.54657: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867334.54660: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node2' 18662 1726867334.54727: Set connection var ansible_timeout to 10 18662 1726867334.54736: Set connection var ansible_connection to ssh 18662 1726867334.54747: Set connection var ansible_shell_executable to /bin/sh 18662 1726867334.54755: Set connection var ansible_shell_type to sh 18662 1726867334.54778: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867334.54789: Set connection var ansible_pipelining to False 18662 1726867334.54824: variable 'ansible_shell_executable' from source: unknown 18662 1726867334.54833: variable 'ansible_connection' from source: unknown 18662 1726867334.54840: variable 'ansible_module_compression' from source: unknown 18662 1726867334.54847: variable 'ansible_shell_type' from source: unknown 18662 1726867334.54854: variable 'ansible_shell_executable' from source: unknown 18662 1726867334.54874: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867334.54879: variable 'ansible_pipelining' from source: unknown 18662 1726867334.54913: variable 'ansible_timeout' from source: unknown 18662 1726867334.54916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867334.55143: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867334.55203: variable 'omit' from source: magic vars 18662 1726867334.55206: starting attempt loop 18662 1726867334.55212: running the handler 18662 1726867334.55214: _low_level_execute_command(): starting 18662 1726867334.55216: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867334.55992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867334.56011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867334.56093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867334.56154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867334.56169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867334.56265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867334.57941: stdout chunk (state=3): >>>/root <<< 18662 1726867334.58074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867334.58188: stdout chunk (state=3): >>><<< 18662 1726867334.58191: stderr chunk (state=3): >>><<< 18662 1726867334.58194: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867334.58197: _low_level_execute_command(): starting 18662 1726867334.58201: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504 `" && echo ansible-tmp-1726867334.5812378-20021-135360222857504="` echo /root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504 `" ) && sleep 0' 18662 1726867334.58903: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867334.59182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867334.59408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867334.61178: stdout chunk (state=3): >>>ansible-tmp-1726867334.5812378-20021-135360222857504=/root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504 <<< 18662 1726867334.61484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867334.61487: stdout chunk (state=3): >>><<< 18662 1726867334.61489: stderr chunk (state=3): >>><<< 18662 1726867334.61492: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867334.5812378-20021-135360222857504=/root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867334.61494: variable 'ansible_module_compression' from source: unknown 18662 1726867334.61496: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 18662 1726867334.61498: variable 'ansible_facts' from source: unknown 18662 1726867334.61550: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504/AnsiballZ_ping.py 18662 1726867334.61768: Sending initial data 18662 1726867334.61772: Sent initial data (153 bytes) 18662 1726867334.62339: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867334.62393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867334.62473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867334.62506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867334.62526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867334.62546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867334.62726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867334.64264: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867334.64304: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867334.64353: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp8kdxhmya /root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504/AnsiballZ_ping.py <<< 18662 1726867334.64356: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504/AnsiballZ_ping.py" <<< 18662 1726867334.64387: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp8kdxhmya" to remote "/root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504/AnsiballZ_ping.py" <<< 18662 1726867334.65682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867334.65685: stdout chunk (state=3): >>><<< 18662 1726867334.65727: stderr chunk (state=3): >>><<< 18662 1726867334.65743: done transferring module to remote 18662 1726867334.65754: _low_level_execute_command(): starting 18662 1726867334.65760: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504/ /root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504/AnsiballZ_ping.py && sleep 0' 18662 1726867334.66862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867334.67043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867334.68749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867334.68910: stderr chunk (state=3): >>><<< 18662 1726867334.68914: stdout chunk (state=3): >>><<< 18662 1726867334.68917: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867334.68923: _low_level_execute_command(): starting 18662 1726867334.68926: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504/AnsiballZ_ping.py && sleep 0' 18662 1726867334.69899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867334.70181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867334.70189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867334.70395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867334.85318: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18662 1726867334.86747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867334.86752: stdout chunk (state=3): >>><<< 18662 1726867334.86785: stderr chunk (state=3): >>><<< 18662 1726867334.86788: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867334.86791: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867334.86829: _low_level_execute_command(): starting 18662 1726867334.86832: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867334.5812378-20021-135360222857504/ > /dev/null 2>&1 && sleep 0' 18662 1726867334.87865: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867334.87943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867334.87954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867334.88041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867334.88044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867334.88046: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867334.88049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867334.88068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18662 1726867334.88080: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 18662 1726867334.88196: stderr chunk (state=3): >>>debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867334.88406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867334.88461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867334.90372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867334.90376: stdout chunk (state=3): >>><<< 18662 1726867334.90380: stderr chunk (state=3): >>><<< 18662 1726867334.90390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867334.90412: handler run complete 18662 1726867334.90416: attempt loop complete, returning result 18662 1726867334.90418: _execute() done 18662 1726867334.90420: dumping result to json 18662 1726867334.90423: done dumping result, returning 18662 1726867334.90427: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-efab-a8ce-000000000051] 18662 1726867334.90433: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000051 18662 1726867334.90807: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000051 18662 1726867334.90812: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 18662 1726867334.90871: no more pending results, returning what we have 18662 1726867334.90874: results queue empty 18662 1726867334.90875: checking for any_errors_fatal 18662 1726867334.90882: done checking for any_errors_fatal 18662 1726867334.90883: checking for max_fail_percentage 18662 1726867334.90885: done checking for max_fail_percentage 18662 1726867334.90886: checking to see if all hosts have failed and the running result is not ok 18662 1726867334.90886: done checking to see if all hosts have failed 18662 1726867334.90887: getting the remaining hosts for this loop 18662 1726867334.90888: done getting the remaining hosts for this loop 18662 1726867334.90892: getting 
the next task for host managed_node2 18662 1726867334.90900: done getting next task for host managed_node2 18662 1726867334.90901: ^ task is: TASK: meta (role_complete) 18662 1726867334.90990: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867334.91002: getting variables 18662 1726867334.91004: in VariableManager get_vars() 18662 1726867334.91060: Calling all_inventory to load vars for managed_node2 18662 1726867334.91063: Calling groups_inventory to load vars for managed_node2 18662 1726867334.91065: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867334.91133: Calling all_plugins_play to load vars for managed_node2 18662 1726867334.91137: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867334.91140: Calling groups_plugins_play to load vars for managed_node2 18662 1726867334.92667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867334.94439: done with get_vars() 18662 1726867334.94462: done getting variables 18662 1726867334.94552: done queuing things up, now waiting for results queue to drain 18662 1726867334.94554: results queue empty 18662 1726867334.94554: checking for any_errors_fatal 18662 1726867334.94557: done checking for any_errors_fatal 18662 1726867334.94558: checking for max_fail_percentage 18662 1726867334.94559: done checking for max_fail_percentage 18662 1726867334.94559: checking to see if all hosts have failed and the running result is not ok 18662 1726867334.94560: done checking to see if all hosts have failed 18662 1726867334.94560: getting the remaining hosts for this loop 18662 1726867334.94561: done getting the remaining hosts for this loop 18662 1726867334.94564: getting the next task for host managed_node2 18662 1726867334.94567: done getting next task for host managed_node2 18662 1726867334.94568: ^ task is: TASK: meta (flush_handlers) 18662 1726867334.94570: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867334.94573: getting variables 18662 1726867334.94573: in VariableManager get_vars() 18662 1726867334.94587: Calling all_inventory to load vars for managed_node2 18662 1726867334.94589: Calling groups_inventory to load vars for managed_node2 18662 1726867334.94591: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867334.94606: Calling all_plugins_play to load vars for managed_node2 18662 1726867334.94611: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867334.94615: Calling groups_plugins_play to load vars for managed_node2 18662 1726867334.96170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867334.97762: done with get_vars() 18662 1726867334.97784: done getting variables 18662 1726867334.97832: in VariableManager get_vars() 18662 1726867334.97844: Calling all_inventory to load vars for managed_node2 18662 1726867334.97847: Calling groups_inventory to load vars for managed_node2 18662 1726867334.97849: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867334.97853: Calling all_plugins_play to load vars for managed_node2 18662 1726867334.97856: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867334.97858: Calling groups_plugins_play to load vars for managed_node2 18662 1726867334.98969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867335.00475: done with get_vars() 18662 1726867335.00503: done queuing things up, now waiting for results queue to drain 18662 1726867335.00506: results queue empty 18662 1726867335.00506: checking for any_errors_fatal 18662 1726867335.00508: done checking for any_errors_fatal 18662 1726867335.00509: checking for max_fail_percentage 18662 1726867335.00510: done checking for max_fail_percentage 18662 1726867335.00511: checking to see if all hosts have failed and the running result is not ok 18662 1726867335.00511: done checking to see if all hosts have failed 18662 1726867335.00512: getting the remaining hosts for this loop 18662 1726867335.00513: done getting the remaining hosts for this loop 18662 1726867335.00516: getting the next task for host managed_node2 18662 1726867335.00520: done getting next task for host managed_node2 18662 1726867335.00522: ^ task is: TASK: meta (flush_handlers) 18662 1726867335.00523: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867335.00526: getting variables 18662 1726867335.00527: in VariableManager get_vars() 18662 1726867335.00538: Calling all_inventory to load vars for managed_node2 18662 1726867335.00540: Calling groups_inventory to load vars for managed_node2 18662 1726867335.00541: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867335.00545: Calling all_plugins_play to load vars for managed_node2 18662 1726867335.00547: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867335.00549: Calling groups_plugins_play to load vars for managed_node2 18662 1726867335.01687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867335.03173: done with get_vars() 18662 1726867335.03194: done getting variables 18662 1726867335.03243: in VariableManager get_vars() 18662 1726867335.03255: Calling all_inventory to load vars for managed_node2 18662 1726867335.03258: Calling groups_inventory to load vars for managed_node2 18662 1726867335.03260: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867335.03264: Calling all_plugins_play to load vars for managed_node2 18662 1726867335.03267: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867335.03269: Calling groups_plugins_play to load vars for managed_node2 18662 1726867335.04360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867335.05887: done with get_vars() 18662 1726867335.05912: done queuing things up, now waiting for results queue to drain 18662 1726867335.05914: results queue empty 18662 1726867335.05915: checking for any_errors_fatal 18662 1726867335.05916: done checking for any_errors_fatal 18662 1726867335.05917: checking for max_fail_percentage 18662 1726867335.05918: done checking for max_fail_percentage 18662 1726867335.05918: checking to see if all hosts have failed and the running result is not ok 18662 1726867335.05924: done checking to see if all hosts have failed 18662 1726867335.05924: getting the remaining hosts for this loop 18662 1726867335.05925: done getting the remaining hosts for this loop 18662 1726867335.05928: getting the next task for host managed_node2 18662 1726867335.05933: done getting next task for host managed_node2 18662 1726867335.05934: ^ task is: None 18662 1726867335.05935: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867335.05936: done queuing things up, now waiting for results queue to drain 18662 1726867335.05937: results queue empty 18662 1726867335.05938: checking for any_errors_fatal 18662 1726867335.05938: done checking for any_errors_fatal 18662 1726867335.05939: checking for max_fail_percentage 18662 1726867335.05940: done checking for max_fail_percentage 18662 1726867335.05941: checking to see if all hosts have failed and the running result is not ok 18662 1726867335.05941: done checking to see if all hosts have failed 18662 1726867335.05942: getting the next task for host managed_node2 18662 1726867335.05945: done getting next task for host managed_node2 18662 1726867335.05946: ^ task is: None 18662 1726867335.05947: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867335.06194: in VariableManager get_vars() 18662 1726867335.06209: done with get_vars() 18662 1726867335.06215: in VariableManager get_vars() 18662 1726867335.06224: done with get_vars() 18662 1726867335.06229: variable 'omit' from source: magic vars 18662 1726867335.06260: in VariableManager get_vars() 18662 1726867335.06270: done with get_vars() 18662 1726867335.06297: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 18662 1726867335.06573: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18662 1726867335.06598: getting the remaining hosts for this loop 18662 1726867335.06600: done getting the remaining hosts for this loop 18662 1726867335.06602: getting the next task for host managed_node2 18662 1726867335.06605: done getting next task for host managed_node2 18662 1726867335.06607: ^ task is: TASK: Gathering Facts 18662 1726867335.06608: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867335.06610: getting variables 18662 1726867335.06611: in VariableManager get_vars() 18662 1726867335.06619: Calling all_inventory to load vars for managed_node2 18662 1726867335.06622: Calling groups_inventory to load vars for managed_node2 18662 1726867335.06624: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867335.06629: Calling all_plugins_play to load vars for managed_node2 18662 1726867335.06631: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867335.06634: Calling groups_plugins_play to load vars for managed_node2 18662 1726867335.08599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867335.10276: done with get_vars() 18662 1726867335.10297: done getting variables 18662 1726867335.10336: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 17:22:15 -0400 (0:00:00.585) 0:00:29.739 ****** 18662 1726867335.10359: entering _queue_task() for managed_node2/gather_facts 18662 1726867335.10915: worker is 1 (out of 1 available) 18662 1726867335.10925: exiting _queue_task() for managed_node2/gather_facts 18662 1726867335.10935: done queuing things up, now waiting for results queue to drain 18662 1726867335.10936: waiting for pending results... 
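A new play, "Delete the interface", starts here and opens with fact gathering at line 5 of down_profile+delete_interface.yml. A minimal sketch of how such a play header could look; the hosts pattern and the explicit gather_facts key are assumptions, since only the play name, the target host managed_node2, and the Gathering Facts task are visible in this log:

    - name: Delete the interface
      hosts: all
      gather_facts: true
      tasks:
        # interface deletion tasks follow (not shown at this point in the log)
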
18662 1726867335.11696: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18662 1726867335.11785: in run() - task 0affcac9-a3a5-efab-a8ce-0000000003f8 18662 1726867335.11791: variable 'ansible_search_path' from source: unknown 18662 1726867335.11795: calling self._execute() 18662 1726867335.12066: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867335.12174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867335.12179: variable 'omit' from source: magic vars 18662 1726867335.12485: variable 'ansible_distribution_major_version' from source: facts 18662 1726867335.12505: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867335.12514: variable 'omit' from source: magic vars 18662 1726867335.12544: variable 'omit' from source: magic vars 18662 1726867335.12581: variable 'omit' from source: magic vars 18662 1726867335.12628: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867335.12665: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867335.12694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867335.12719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867335.12735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867335.12766: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867335.12773: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867335.12784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867335.12906: Set connection var ansible_timeout to 10 18662 1726867335.12914: Set connection var ansible_connection to ssh 18662 1726867335.12925: Set connection var ansible_shell_executable to /bin/sh 18662 1726867335.12936: Set connection var ansible_shell_type to sh 18662 1726867335.12949: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867335.12958: Set connection var ansible_pipelining to False 18662 1726867335.12987: variable 'ansible_shell_executable' from source: unknown 18662 1726867335.12995: variable 'ansible_connection' from source: unknown 18662 1726867335.13003: variable 'ansible_module_compression' from source: unknown 18662 1726867335.13009: variable 'ansible_shell_type' from source: unknown 18662 1726867335.13015: variable 'ansible_shell_executable' from source: unknown 18662 1726867335.13020: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867335.13026: variable 'ansible_pipelining' from source: unknown 18662 1726867335.13031: variable 'ansible_timeout' from source: unknown 18662 1726867335.13041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867335.13224: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867335.13239: variable 'omit' from source: magic vars 18662 1726867335.13248: starting attempt loop 18662 1726867335.13260: running the 
handler 18662 1726867335.13281: variable 'ansible_facts' from source: unknown 18662 1726867335.13306: _low_level_execute_command(): starting 18662 1726867335.13319: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867335.14102: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867335.14126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867335.14150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867335.14183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867335.14383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867335.16192: stdout chunk (state=3): >>>/root <<< 18662 1726867335.16195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867335.16198: stdout chunk (state=3): >>><<< 18662 1726867335.16200: stderr chunk (state=3): >>><<< 18662 1726867335.16204: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867335.16218: _low_level_execute_command(): starting 18662 1726867335.16229: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218 `" && echo ansible-tmp-1726867335.161833-20064-269158647739218="` echo 
/root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218 `" ) && sleep 0' 18662 1726867335.18163: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867335.18167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867335.18170: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867335.18182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 18662 1726867335.18186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867335.18558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867335.18664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867335.20592: stdout chunk (state=3): >>>ansible-tmp-1726867335.161833-20064-269158647739218=/root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218 <<< 18662 1726867335.20698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867335.20733: stderr chunk (state=3): >>><<< 18662 1726867335.20994: stdout chunk (state=3): >>><<< 18662 1726867335.20997: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867335.161833-20064-269158647739218=/root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867335.21000: variable 'ansible_module_compression' from source: unknown 18662 1726867335.21001: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18662 1726867335.21063: variable 'ansible_facts' from source: unknown 18662 1726867335.21647: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218/AnsiballZ_setup.py 18662 1726867335.21817: Sending initial data 18662 1726867335.21820: Sent initial data (153 bytes) 18662 1726867335.23217: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867335.23255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867335.23342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867335.23476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867335.23554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867335.23669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867335.23785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867335.24496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867335.25980: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867335.26126: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867335.26374: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpmq0uym6i /root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218/AnsiballZ_setup.py <<< 18662 1726867335.26380: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218/AnsiballZ_setup.py" <<< 18662 1726867335.26401: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpmq0uym6i" to remote "/root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218/AnsiballZ_setup.py" <<< 18662 1726867335.26426: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218/AnsiballZ_setup.py" <<< 18662 1726867335.29090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867335.29305: stderr chunk (state=3): >>><<< 18662 1726867335.29314: stdout chunk (state=3): >>><<< 18662 1726867335.29434: done transferring module to remote 18662 1726867335.29437: _low_level_execute_command(): starting 18662 1726867335.29440: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218/ /root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218/AnsiballZ_setup.py && sleep 0' 18662 1726867335.31239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867335.31262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867335.31266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867335.31341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867335.32090: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867335.32157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867335.34235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867335.34239: stdout chunk (state=3): >>><<< 18662 1726867335.34245: stderr chunk (state=3): >>><<< 18662 1726867335.34339: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867335.34343: _low_level_execute_command(): starting 18662 1726867335.34347: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218/AnsiballZ_setup.py && sleep 0' 18662 1726867335.35364: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867335.35367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867335.35370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867335.35372: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867335.35375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867335.35493: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 18662 1726867335.35496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867335.35550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867335.35553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867335.35722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867335.35768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867336.02921: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", 
"ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel<<< 18662 1726867336.02988: stdout chunk (state=3): >>>", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", 
"Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2945, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 586, "free": 2945}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 573, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794844672, "block_size": 4096, "block_total": 65519099, "block_available": 63914757, "block_used": 1604342, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "22", "second": "15", "epoch": "1726867335", "epoch_int": "1726867335", "date": "2024-09-20", "time": "17:22:15", "iso8601_micro": "2024-09-20T21:22:15.966794Z", "iso8601": "2024-09-20T21:22:15Z", "iso8601_basic": "20240920T172215966794", "iso8601_basic_short": "20240920T172215", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": 
"UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.4091796875, "5m": 0.38525390625, "15m": 0.205078125}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo", "peerlsr27", "lsr27"], "ansible_lsr27": {"device": "lsr27", "macaddress": "32:21:19:c5:50:d2", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3021:19ff:fec5:50d2", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "e2:02:64:1b:da:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e002:64ff:fe1b:da9a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", 
"generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", 
"prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a", "fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18662 1726867336.05151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867336.05173: stdout chunk (state=3): >>><<< 18662 1726867336.05190: stderr chunk (state=3): >>><<< 18662 1726867336.05242: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2945, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 586, "free": 2945}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": 
"mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 573, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794844672, "block_size": 4096, "block_total": 65519099, "block_available": 63914757, "block_used": 1604342, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "22", "second": "15", "epoch": "1726867335", "epoch_int": "1726867335", "date": "2024-09-20", "time": "17:22:15", "iso8601_micro": "2024-09-20T21:22:15.966794Z", "iso8601": "2024-09-20T21:22:15Z", "iso8601_basic": "20240920T172215966794", "iso8601_basic_short": "20240920T172215", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.4091796875, "5m": 0.38525390625, "15m": 0.205078125}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo", "peerlsr27", "lsr27"], "ansible_lsr27": {"device": "lsr27", "macaddress": "32:21:19:c5:50:d2", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::3021:19ff:fec5:50d2", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "e2:02:64:1b:da:9a", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::e002:64ff:fe1b:da9a", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": 
"10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a", "fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad", "fe80::3021:19ff:fec5:50d2", "fe80::e002:64ff:fe1b:da9a"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867336.05807: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867336.05814: _low_level_execute_command(): starting 18662 1726867336.05825: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867335.161833-20064-269158647739218/ > /dev/null 2>&1 && sleep 0' 18662 1726867336.06469: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867336.06525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867336.06529: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867336.06586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867336.06618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867336.06651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867336.06705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867336.08583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867336.08586: stdout chunk (state=3): >>><<< 18662 1726867336.08589: stderr chunk (state=3): >>><<< 18662 1726867336.08605: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867336.08784: handler run complete 18662 1726867336.08787: variable 'ansible_facts' from source: unknown 18662 1726867336.08887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867336.09279: variable 'ansible_facts' from source: unknown 18662 1726867336.09391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867336.09573: attempt loop complete, returning result 18662 1726867336.09587: _execute() done 18662 1726867336.09595: dumping result to json 18662 1726867336.09642: done dumping result, returning 18662 1726867336.09658: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-efab-a8ce-0000000003f8] 18662 1726867336.09672: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000003f8 ok: [managed_node2] 18662 1726867336.10834: no more pending results, returning what we have 18662 1726867336.10838: results queue empty 18662 1726867336.10838: checking for any_errors_fatal 18662 1726867336.10840: done checking for any_errors_fatal 18662 1726867336.10840: checking for max_fail_percentage 18662 1726867336.10842: done checking for max_fail_percentage 18662 1726867336.10843: checking to see if all hosts have failed and the running result is not ok 18662 1726867336.10843: done checking to see if all hosts have failed 18662 1726867336.10844: getting the remaining hosts for this loop 18662 1726867336.10845: done getting the remaining hosts for this loop 18662 1726867336.10883: getting the next task for host managed_node2 18662 1726867336.10888: done getting next task for host managed_node2 18662 1726867336.10890: ^ task is: TASK: meta (flush_handlers) 18662 1726867336.10892: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867336.10896: getting variables 18662 1726867336.10897: in VariableManager get_vars() 18662 1726867336.10921: Calling all_inventory to load vars for managed_node2 18662 1726867336.10924: Calling groups_inventory to load vars for managed_node2 18662 1726867336.10927: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867336.10935: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000003f8 18662 1726867336.10937: WORKER PROCESS EXITING 18662 1726867336.10946: Calling all_plugins_play to load vars for managed_node2 18662 1726867336.10950: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867336.10953: Calling groups_plugins_play to load vars for managed_node2 18662 1726867336.13041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867336.14949: done with get_vars() 18662 1726867336.14972: done getting variables 18662 1726867336.15050: in VariableManager get_vars() 18662 1726867336.15060: Calling all_inventory to load vars for managed_node2 18662 1726867336.15062: Calling groups_inventory to load vars for managed_node2 18662 1726867336.15065: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867336.15070: Calling all_plugins_play to load vars for managed_node2 18662 1726867336.15072: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867336.15075: Calling groups_plugins_play to load vars for managed_node2 18662 1726867336.16241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867336.20125: done with get_vars() 18662 1726867336.20155: done queuing things up, now waiting for results queue to drain 18662 1726867336.20158: results queue empty 18662 1726867336.20159: checking for any_errors_fatal 18662 1726867336.20163: done checking for any_errors_fatal 18662 1726867336.20163: checking for max_fail_percentage 18662 1726867336.20165: done checking for max_fail_percentage 18662 1726867336.20169: checking to see if all hosts have failed and the running result is not ok 18662 1726867336.20170: done checking to see if all hosts have failed 18662 1726867336.20171: getting the remaining hosts for this loop 18662 1726867336.20171: done getting the remaining hosts for this loop 18662 1726867336.20174: getting the next task for host managed_node2 18662 1726867336.20381: done getting next task for host managed_node2 18662 1726867336.20386: ^ task is: TASK: Include the task 'delete_interface.yml' 18662 1726867336.20387: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867336.20390: getting variables 18662 1726867336.20391: in VariableManager get_vars() 18662 1726867336.20401: Calling all_inventory to load vars for managed_node2 18662 1726867336.20403: Calling groups_inventory to load vars for managed_node2 18662 1726867336.20406: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867336.20411: Calling all_plugins_play to load vars for managed_node2 18662 1726867336.20414: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867336.20417: Calling groups_plugins_play to load vars for managed_node2 18662 1726867336.28397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867336.31576: done with get_vars() 18662 1726867336.31605: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 17:22:16 -0400 (0:00:01.213) 0:00:30.954 ****** 18662 1726867336.31852: entering _queue_task() for managed_node2/include_tasks 18662 1726867336.32530: worker is 1 (out of 1 available) 18662 1726867336.32542: exiting _queue_task() for managed_node2/include_tasks 18662 1726867336.32555: done queuing things up, now waiting for results queue to drain 18662 1726867336.32556: waiting for pending results... 18662 1726867336.33297: running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' 18662 1726867336.33352: in run() - task 0affcac9-a3a5-efab-a8ce-000000000054 18662 1726867336.33372: variable 'ansible_search_path' from source: unknown 18662 1726867336.33447: calling self._execute() 18662 1726867336.33702: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867336.33722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867336.33753: variable 'omit' from source: magic vars 18662 1726867336.34947: variable 'ansible_distribution_major_version' from source: facts 18662 1726867336.34964: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867336.35469: _execute() done 18662 1726867336.35473: dumping result to json 18662 1726867336.35475: done dumping result, returning 18662 1726867336.35480: done running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' [0affcac9-a3a5-efab-a8ce-000000000054] 18662 1726867336.35482: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000054 18662 1726867336.35580: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000054 18662 1726867336.35585: WORKER PROCESS EXITING 18662 1726867336.35612: no more pending results, returning what we have 18662 1726867336.35618: in VariableManager get_vars() 18662 1726867336.35651: Calling all_inventory to load vars for managed_node2 18662 1726867336.35653: Calling groups_inventory to load vars for managed_node2 18662 1726867336.35656: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867336.35674: Calling all_plugins_play to load vars for managed_node2 18662 1726867336.35679: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867336.35681: Calling groups_plugins_play to load vars for managed_node2 18662 1726867336.38352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867336.42417: done with get_vars() 18662 
1726867336.42435: variable 'ansible_search_path' from source: unknown 18662 1726867336.42450: we have included files to process 18662 1726867336.42452: generating all_blocks data 18662 1726867336.42453: done generating all_blocks data 18662 1726867336.42455: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18662 1726867336.42456: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18662 1726867336.42458: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 18662 1726867336.42994: done processing included file 18662 1726867336.42996: iterating over new_blocks loaded from include file 18662 1726867336.42998: in VariableManager get_vars() 18662 1726867336.43011: done with get_vars() 18662 1726867336.43013: filtering new block on tags 18662 1726867336.43030: done filtering new block on tags 18662 1726867336.43033: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 18662 1726867336.43038: extending task lists for all hosts with included blocks 18662 1726867336.43070: done extending task lists 18662 1726867336.43071: done processing included files 18662 1726867336.43072: results queue empty 18662 1726867336.43072: checking for any_errors_fatal 18662 1726867336.43074: done checking for any_errors_fatal 18662 1726867336.43075: checking for max_fail_percentage 18662 1726867336.43161: done checking for max_fail_percentage 18662 1726867336.43163: checking to see if all hosts have failed and the running result is not ok 18662 1726867336.43164: done checking to see if all hosts have failed 18662 1726867336.43165: getting the remaining hosts for this loop 18662 1726867336.43166: done getting the remaining hosts for this loop 18662 1726867336.43169: getting the next task for host managed_node2 18662 1726867336.43173: done getting next task for host managed_node2 18662 1726867336.43175: ^ task is: TASK: Remove test interface if necessary 18662 1726867336.43180: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867336.43182: getting variables 18662 1726867336.43183: in VariableManager get_vars() 18662 1726867336.43192: Calling all_inventory to load vars for managed_node2 18662 1726867336.43195: Calling groups_inventory to load vars for managed_node2 18662 1726867336.43197: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867336.43203: Calling all_plugins_play to load vars for managed_node2 18662 1726867336.43205: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867336.43208: Calling groups_plugins_play to load vars for managed_node2 18662 1726867336.44621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867336.46680: done with get_vars() 18662 1726867336.46698: done getting variables 18662 1726867336.46739: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 17:22:16 -0400 (0:00:00.149) 0:00:31.103 ****** 18662 1726867336.46773: entering _queue_task() for managed_node2/command 18662 1726867336.47218: worker is 1 (out of 1 available) 18662 1726867336.47230: exiting _queue_task() for managed_node2/command 18662 1726867336.47243: done queuing things up, now waiting for results queue to drain 18662 1726867336.47245: waiting for pending results... 18662 1726867336.47540: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 18662 1726867336.47649: in run() - task 0affcac9-a3a5-efab-a8ce-000000000409 18662 1726867336.47669: variable 'ansible_search_path' from source: unknown 18662 1726867336.47679: variable 'ansible_search_path' from source: unknown 18662 1726867336.47730: calling self._execute() 18662 1726867336.47819: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867336.47854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867336.47858: variable 'omit' from source: magic vars 18662 1726867336.48247: variable 'ansible_distribution_major_version' from source: facts 18662 1726867336.48273: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867336.48276: variable 'omit' from source: magic vars 18662 1726867336.48586: variable 'omit' from source: magic vars 18662 1726867336.48620: variable 'interface' from source: set_fact 18662 1726867336.48642: variable 'omit' from source: magic vars 18662 1726867336.48737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867336.48774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867336.48831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867336.48902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867336.48923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 
1726867336.49012: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867336.49022: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867336.49032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867336.49146: Set connection var ansible_timeout to 10 18662 1726867336.49153: Set connection var ansible_connection to ssh 18662 1726867336.49165: Set connection var ansible_shell_executable to /bin/sh 18662 1726867336.49175: Set connection var ansible_shell_type to sh 18662 1726867336.49245: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867336.49249: Set connection var ansible_pipelining to False 18662 1726867336.49251: variable 'ansible_shell_executable' from source: unknown 18662 1726867336.49254: variable 'ansible_connection' from source: unknown 18662 1726867336.49257: variable 'ansible_module_compression' from source: unknown 18662 1726867336.49259: variable 'ansible_shell_type' from source: unknown 18662 1726867336.49261: variable 'ansible_shell_executable' from source: unknown 18662 1726867336.49263: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867336.49264: variable 'ansible_pipelining' from source: unknown 18662 1726867336.49266: variable 'ansible_timeout' from source: unknown 18662 1726867336.49274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867336.49422: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867336.49438: variable 'omit' from source: magic vars 18662 1726867336.49448: starting attempt loop 18662 1726867336.49480: running the handler 18662 1726867336.49487: _low_level_execute_command(): starting 18662 1726867336.49499: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867336.50191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867336.50206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867336.50262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867336.50329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867336.50344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867336.50385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867336.50465: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867336.52288: stdout chunk (state=3): >>>/root <<< 18662 1726867336.52333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867336.52344: stdout chunk (state=3): >>><<< 18662 1726867336.52357: stderr chunk (state=3): >>><<< 18662 1726867336.52382: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867336.52603: _low_level_execute_command(): starting 18662 1726867336.52645: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963 `" && echo ansible-tmp-1726867336.5256994-20138-138817354933963="` echo /root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963 `" ) && sleep 0' 18662 1726867336.53275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867336.53289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867336.53301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867336.53349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867336.53436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867336.53439: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867336.53467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867336.53512: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 18662 1726867336.55783: stdout chunk (state=3): >>>ansible-tmp-1726867336.5256994-20138-138817354933963=/root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963 <<< 18662 1726867336.55787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867336.55790: stdout chunk (state=3): >>><<< 18662 1726867336.55793: stderr chunk (state=3): >>><<< 18662 1726867336.55796: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867336.5256994-20138-138817354933963=/root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867336.55799: variable 'ansible_module_compression' from source: unknown 18662 1726867336.55801: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18662 1726867336.55803: variable 'ansible_facts' from source: unknown 18662 1726867336.56000: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963/AnsiballZ_command.py 18662 1726867336.56221: Sending initial data 18662 1726867336.56294: Sent initial data (156 bytes) 18662 1726867336.57067: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867336.57086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867336.57099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867336.57117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867336.57194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867336.57223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867336.57248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867336.57312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867336.58866: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867336.58903: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867336.58947: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp9v_3g7d7 /root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963/AnsiballZ_command.py <<< 18662 1726867336.58952: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963/AnsiballZ_command.py" <<< 18662 1726867336.59011: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp9v_3g7d7" to remote "/root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963/AnsiballZ_command.py" <<< 18662 1726867336.59937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867336.59993: stderr chunk (state=3): >>><<< 18662 1726867336.60005: stdout chunk (state=3): >>><<< 18662 1726867336.60129: done transferring module to remote 18662 1726867336.60132: _low_level_execute_command(): starting 18662 1726867336.60135: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963/ /root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963/AnsiballZ_command.py && sleep 0' 18662 1726867336.60658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867336.60672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867336.60691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867336.60712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867336.60738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867336.60751: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867336.60766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
18662 1726867336.60843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867336.60885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867336.60905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867336.60931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867336.61007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867336.62796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867336.62816: stdout chunk (state=3): >>><<< 18662 1726867336.62828: stderr chunk (state=3): >>><<< 18662 1726867336.62847: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867336.62866: _low_level_execute_command(): starting 18662 1726867336.62895: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963/AnsiballZ_command.py && sleep 0' 18662 1726867336.63650: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867336.63660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867336.63671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867336.63828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867336.63831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867336.63834: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867336.63836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867336.63838: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 18662 1726867336.63840: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 18662 1726867336.63842: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 18662 1726867336.63843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867336.63845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867336.63847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867336.63849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867336.63898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867336.63933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867336.80317: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 17:22:16.790597", "end": "2024-09-20 17:22:16.800247", "delta": "0:00:00.009650", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18662 1726867336.82587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867336.82591: stdout chunk (state=3): >>><<< 18662 1726867336.82754: stderr chunk (state=3): >>><<< 18662 1726867336.82759: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-20 17:22:16.790597", "end": "2024-09-20 17:22:16.800247", "delta": "0:00:00.009650", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
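For context, the JSON above is the raw return of the ansible.legacy.command module for the "Remove test interface if necessary" task (task path .../tasks/delete_interface.yml:3). A minimal sketch of what that task plausibly looks like, reconstructed only from details visible in this log (the task name, the "ip link del" command, and the 'interface' variable coming from an earlier set_fact); the changed_when and failed_when lines are assumptions, not the actual file contents:

# hypothetical sketch of tasks/delete_interface.yml, reconstructed from the log
- name: Remove test interface if necessary
  command: ip link del {{ interface }}   # 'interface' resolves to lsr27 via an earlier set_fact
  changed_when: false   # assumption: would explain why the task summary below reports changed: false
  failed_when: false    # assumption: would let the play continue when the link is already absent

Whatever the real options are, the difference between the raw module return above (changed: true) and the task summary printed below (changed: false) shows that a changed_when-style override is applied after the module returns.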
18662 1726867336.82762: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867336.82764: _low_level_execute_command(): starting 18662 1726867336.82767: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867336.5256994-20138-138817354933963/ > /dev/null 2>&1 && sleep 0' 18662 1726867336.83363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867336.83380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867336.83449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867336.83518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867336.83537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867336.83668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867336.83739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867336.85615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867336.85631: stdout chunk (state=3): >>><<< 18662 1726867336.85643: stderr chunk (state=3): >>><<< 18662 1726867336.85666: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867336.85688: handler run complete 18662 1726867336.85716: Evaluated conditional (False): False 18662 1726867336.85883: attempt loop complete, returning result 18662 1726867336.85887: _execute() done 18662 1726867336.85889: dumping result to json 18662 1726867336.85891: done dumping result, returning 18662 1726867336.85893: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [0affcac9-a3a5-efab-a8ce-000000000409] 18662 1726867336.85895: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000409 18662 1726867336.85964: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000409 18662 1726867336.85967: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr27" ], "delta": "0:00:00.009650", "end": "2024-09-20 17:22:16.800247", "rc": 0, "start": "2024-09-20 17:22:16.790597" } 18662 1726867336.86039: no more pending results, returning what we have 18662 1726867336.86044: results queue empty 18662 1726867336.86046: checking for any_errors_fatal 18662 1726867336.86047: done checking for any_errors_fatal 18662 1726867336.86048: checking for max_fail_percentage 18662 1726867336.86050: done checking for max_fail_percentage 18662 1726867336.86051: checking to see if all hosts have failed and the running result is not ok 18662 1726867336.86052: done checking to see if all hosts have failed 18662 1726867336.86053: getting the remaining hosts for this loop 18662 1726867336.86055: done getting the remaining hosts for this loop 18662 1726867336.86059: getting the next task for host managed_node2 18662 1726867336.86070: done getting next task for host managed_node2 18662 1726867336.86072: ^ task is: TASK: meta (flush_handlers) 18662 1726867336.86074: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867336.86081: getting variables 18662 1726867336.86084: in VariableManager get_vars() 18662 1726867336.86114: Calling all_inventory to load vars for managed_node2 18662 1726867336.86117: Calling groups_inventory to load vars for managed_node2 18662 1726867336.86120: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867336.86132: Calling all_plugins_play to load vars for managed_node2 18662 1726867336.86135: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867336.86137: Calling groups_plugins_play to load vars for managed_node2 18662 1726867336.87956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867336.90787: done with get_vars() 18662 1726867336.90811: done getting variables 18662 1726867336.90897: in VariableManager get_vars() 18662 1726867336.90907: Calling all_inventory to load vars for managed_node2 18662 1726867336.90909: Calling groups_inventory to load vars for managed_node2 18662 1726867336.90911: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867336.90917: Calling all_plugins_play to load vars for managed_node2 18662 1726867336.90919: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867336.90922: Calling groups_plugins_play to load vars for managed_node2 18662 1726867336.92184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867336.93865: done with get_vars() 18662 1726867336.93902: done queuing things up, now waiting for results queue to drain 18662 1726867336.93905: results queue empty 18662 1726867336.93906: checking for any_errors_fatal 18662 1726867336.93909: done checking for any_errors_fatal 18662 1726867336.93910: checking for max_fail_percentage 18662 1726867336.93911: done checking for max_fail_percentage 18662 1726867336.93912: checking to see if all hosts have failed and the running result is not ok 18662 1726867336.93913: done checking to see if all hosts have failed 18662 1726867336.93914: getting the remaining hosts for this loop 18662 1726867336.93915: done getting the remaining hosts for this loop 18662 1726867336.93917: getting the next task for host managed_node2 18662 1726867336.93921: done getting next task for host managed_node2 18662 1726867336.93923: ^ task is: TASK: meta (flush_handlers) 18662 1726867336.93924: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867336.93927: getting variables 18662 1726867336.93928: in VariableManager get_vars() 18662 1726867336.93937: Calling all_inventory to load vars for managed_node2 18662 1726867336.93939: Calling groups_inventory to load vars for managed_node2 18662 1726867336.93942: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867336.93947: Calling all_plugins_play to load vars for managed_node2 18662 1726867336.93949: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867336.93952: Calling groups_plugins_play to load vars for managed_node2 18662 1726867336.95224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867336.97044: done with get_vars() 18662 1726867336.97063: done getting variables 18662 1726867336.97122: in VariableManager get_vars() 18662 1726867336.97139: Calling all_inventory to load vars for managed_node2 18662 1726867336.97142: Calling groups_inventory to load vars for managed_node2 18662 1726867336.97144: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867336.97149: Calling all_plugins_play to load vars for managed_node2 18662 1726867336.97151: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867336.97154: Calling groups_plugins_play to load vars for managed_node2 18662 1726867336.98291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867336.99872: done with get_vars() 18662 1726867336.99899: done queuing things up, now waiting for results queue to drain 18662 1726867336.99902: results queue empty 18662 1726867336.99903: checking for any_errors_fatal 18662 1726867336.99904: done checking for any_errors_fatal 18662 1726867336.99905: checking for max_fail_percentage 18662 1726867336.99906: done checking for max_fail_percentage 18662 1726867336.99906: checking to see if all hosts have failed and the running result is not ok 18662 1726867336.99907: done checking to see if all hosts have failed 18662 1726867336.99911: getting the remaining hosts for this loop 18662 1726867336.99912: done getting the remaining hosts for this loop 18662 1726867336.99915: getting the next task for host managed_node2 18662 1726867336.99918: done getting next task for host managed_node2 18662 1726867336.99919: ^ task is: None 18662 1726867336.99920: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867336.99921: done queuing things up, now waiting for results queue to drain 18662 1726867336.99922: results queue empty 18662 1726867336.99923: checking for any_errors_fatal 18662 1726867336.99924: done checking for any_errors_fatal 18662 1726867336.99924: checking for max_fail_percentage 18662 1726867336.99925: done checking for max_fail_percentage 18662 1726867336.99926: checking to see if all hosts have failed and the running result is not ok 18662 1726867336.99926: done checking to see if all hosts have failed 18662 1726867336.99928: getting the next task for host managed_node2 18662 1726867336.99930: done getting next task for host managed_node2 18662 1726867336.99931: ^ task is: None 18662 1726867336.99932: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867336.99973: in VariableManager get_vars() 18662 1726867336.99998: done with get_vars() 18662 1726867337.00005: in VariableManager get_vars() 18662 1726867337.00021: done with get_vars() 18662 1726867337.00026: variable 'omit' from source: magic vars 18662 1726867337.00146: variable 'profile' from source: play vars 18662 1726867337.00240: in VariableManager get_vars() 18662 1726867337.00253: done with get_vars() 18662 1726867337.00271: variable 'omit' from source: magic vars 18662 1726867337.00336: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 18662 1726867337.00970: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18662 1726867337.00994: getting the remaining hosts for this loop 18662 1726867337.00995: done getting the remaining hosts for this loop 18662 1726867337.00998: getting the next task for host managed_node2 18662 1726867337.01000: done getting next task for host managed_node2 18662 1726867337.01003: ^ task is: TASK: Gathering Facts 18662 1726867337.01004: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867337.01006: getting variables 18662 1726867337.01007: in VariableManager get_vars() 18662 1726867337.01021: Calling all_inventory to load vars for managed_node2 18662 1726867337.01024: Calling groups_inventory to load vars for managed_node2 18662 1726867337.01026: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867337.01030: Calling all_plugins_play to load vars for managed_node2 18662 1726867337.01033: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867337.01035: Calling groups_plugins_play to load vars for managed_node2 18662 1726867337.02299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867337.03834: done with get_vars() 18662 1726867337.03856: done getting variables 18662 1726867337.03902: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 17:22:17 -0400 (0:00:00.571) 0:00:31.674 ****** 18662 1726867337.03931: entering _queue_task() for managed_node2/gather_facts 18662 1726867337.04263: worker is 1 (out of 1 available) 18662 1726867337.04273: exiting _queue_task() for managed_node2/gather_facts 18662 1726867337.04287: done queuing things up, now waiting for results queue to drain 18662 1726867337.04288: waiting for pending results... 
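The fact gathering that begins here belongs to the play announced above as PLAY [Remove {{ profile }}]; its implicit "Gathering Facts" task is recorded against remove_profile.yml:3. A minimal sketch of how the top of that playbook plausibly reads; only the templated play name, the 'profile' play variable, and the fact that gathering is enabled are taken from the log, while the hosts pattern and the placeholder task are assumptions:

# hypothetical sketch of playbooks/remove_profile.yml, not the actual file
- name: Remove {{ profile }}
  hosts: all            # assumption; this run applies the play to managed_node2
  gather_facts: true    # matches the implicit "Gathering Facts" task at remove_profile.yml:3
  tasks:
    - name: Placeholder for the profile removal steps (not reconstructed here)
      ansible.builtin.debug:
        msg: Removing profile {{ profile }}

Because fact gathering is not disabled for this play, the setup module is dispatched again, which is what the AnsiballZ_setup.py transfer further below corresponds to.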
18662 1726867337.04559: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18662 1726867337.04610: in run() - task 0affcac9-a3a5-efab-a8ce-000000000417 18662 1726867337.04632: variable 'ansible_search_path' from source: unknown 18662 1726867337.04684: calling self._execute() 18662 1726867337.04801: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867337.04813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867337.04879: variable 'omit' from source: magic vars 18662 1726867337.05587: variable 'ansible_distribution_major_version' from source: facts 18662 1726867337.05593: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867337.05595: variable 'omit' from source: magic vars 18662 1726867337.05604: variable 'omit' from source: magic vars 18662 1726867337.05607: variable 'omit' from source: magic vars 18662 1726867337.05609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867337.05612: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867337.05699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867337.05730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867337.05800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867337.05842: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867337.05892: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867337.05902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867337.06070: Set connection var ansible_timeout to 10 18662 1726867337.06283: Set connection var ansible_connection to ssh 18662 1726867337.06285: Set connection var ansible_shell_executable to /bin/sh 18662 1726867337.06287: Set connection var ansible_shell_type to sh 18662 1726867337.06289: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867337.06291: Set connection var ansible_pipelining to False 18662 1726867337.06293: variable 'ansible_shell_executable' from source: unknown 18662 1726867337.06295: variable 'ansible_connection' from source: unknown 18662 1726867337.06297: variable 'ansible_module_compression' from source: unknown 18662 1726867337.06299: variable 'ansible_shell_type' from source: unknown 18662 1726867337.06301: variable 'ansible_shell_executable' from source: unknown 18662 1726867337.06303: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867337.06305: variable 'ansible_pipelining' from source: unknown 18662 1726867337.06306: variable 'ansible_timeout' from source: unknown 18662 1726867337.06308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867337.06650: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867337.06678: variable 'omit' from source: magic vars 18662 1726867337.06720: starting attempt loop 18662 1726867337.06728: running the 
handler 18662 1726867337.06785: variable 'ansible_facts' from source: unknown 18662 1726867337.06818: _low_level_execute_command(): starting 18662 1726867337.06831: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867337.07709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867337.07882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867337.07891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867337.08015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867337.09802: stdout chunk (state=3): >>>/root <<< 18662 1726867337.09843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867337.09846: stdout chunk (state=3): >>><<< 18662 1726867337.09855: stderr chunk (state=3): >>><<< 18662 1726867337.09880: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867337.09897: _low_level_execute_command(): starting 18662 1726867337.09903: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342 `" && echo ansible-tmp-1726867337.098784-20172-195810321460342="` echo /root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342 `" ) && sleep 0' 
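The two low-level commands above (the 'echo ~' home-directory probe and the umask/mkdir one-liner being started here) are the staging step Ansible performs before each module run: a unique directory is created under the remote user's remote_tmp (the module arguments earlier in this log show '_ansible_remote_tmp': '~/.ansible/tmp') and the AnsiballZ payload is copied into it. If that staging area needs to live somewhere else, it can be overridden per host; a minimal inventory sketch, in which the host entry layout and the chosen path are assumptions rather than anything taken from this run:

# hypothetical YAML inventory fragment (path and layout are assumptions)
all:
  hosts:
    managed_node2:
      ansible_remote_tmp: /var/tmp/ansible-tmp   # replaces the default ~/.ansible/tmp seen above

The same option can also be set as remote_tmp under [defaults] in ansible.cfg; this run uses the default, which is why the temporary directories land under /root/.ansible/tmp.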
18662 1726867337.11038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867337.11047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867337.11192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867337.11384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867337.11417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867337.13401: stdout chunk (state=3): >>>ansible-tmp-1726867337.098784-20172-195810321460342=/root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342 <<< 18662 1726867337.13494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867337.13540: stderr chunk (state=3): >>><<< 18662 1726867337.13596: stdout chunk (state=3): >>><<< 18662 1726867337.13614: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867337.098784-20172-195810321460342=/root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867337.13884: variable 'ansible_module_compression' from source: unknown 18662 1726867337.13888: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18662 1726867337.13890: variable 'ansible_facts' from source: unknown 18662 1726867337.14238: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342/AnsiballZ_setup.py 18662 1726867337.14398: Sending initial data 18662 1726867337.14407: Sent initial data (153 bytes) 18662 1726867337.14976: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867337.14997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867337.15102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867337.15120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867337.15136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867337.15218: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867337.15274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867337.16970: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867337.17029: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867337.17114: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp21jc_oqw /root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342/AnsiballZ_setup.py <<< 18662 1726867337.17118: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342/AnsiballZ_setup.py" <<< 18662 1726867337.17213: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp21jc_oqw" to remote "/root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342/AnsiballZ_setup.py" <<< 18662 1726867337.19469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867337.19472: stdout chunk (state=3): >>><<< 18662 1726867337.19485: stderr chunk (state=3): >>><<< 18662 1726867337.19684: done transferring module to remote 18662 1726867337.19687: _low_level_execute_command(): starting 18662 1726867337.19690: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342/ /root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342/AnsiballZ_setup.py && sleep 0' 18662 1726867337.20600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867337.20603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867337.20606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867337.20608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867337.20610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867337.20694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867337.20731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867337.22653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867337.22659: stdout chunk (state=3): >>><<< 18662 1726867337.22662: stderr chunk (state=3): >>><<< 18662 1726867337.22685: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867337.22744: _low_level_execute_command(): starting 18662 1726867337.22748: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342/AnsiballZ_setup.py && sleep 0' 18662 1726867337.23311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867337.23322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867337.23334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867337.23398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867337.23447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867337.23460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867337.23478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867337.23560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867337.88140: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2<<< 18662 1726867337.88170: stdout chunk (state=3): >>>c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", 
"type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off 
[fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2950, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 581, "free": 2950}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 575, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794848768, "block_size": 4096, "block_total": 65519099, "block_available": 63914758, "block_used": 1604341, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "22", "second": "17", "epoch": "1726867337", "epoch_int": "1726867337", "date": "2024-09-20", "time": "17:22:17", "iso8601_micro": "2024-09-20T21:22:17.877766Z", "iso8601": "2024-09-20T21:22:17Z", "iso8601_basic": "20240920T172217877766", "iso8601_basic_short": "20240920T172217", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.3759765625, "5m": 0.37841796875, "15m": 0.20361328125}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18662 1726867337.90207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867337.90238: stderr chunk (state=3): >>><<< 18662 1726867337.90241: stdout chunk (state=3): >>><<< 18662 1726867337.90282: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2950, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, 
"ansible_memory_mb": {"real": {"total": 3531, "used": 581, "free": 2950}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 575, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794848768, "block_size": 4096, "block_total": 65519099, "block_available": 63914758, "block_used": 1604341, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "22", "second": "17", "epoch": "1726867337", "epoch_int": "1726867337", "date": "2024-09-20", "time": "17:22:17", "iso8601_micro": "2024-09-20T21:22:17.877766Z", "iso8601": "2024-09-20T21:22:17Z", "iso8601_basic": "20240920T172217877766", "iso8601_basic_short": "20240920T172217", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.3759765625, "5m": 0.37841796875, "15m": 0.20361328125}, "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867337.90704: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867337.90787: _low_level_execute_command(): starting 18662 1726867337.90790: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867337.098784-20172-195810321460342/ > /dev/null 2>&1 && sleep 0' 18662 1726867337.91354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867337.91466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867337.91494: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867337.91560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867337.93504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867337.93507: stdout chunk (state=3): >>><<< 18662 1726867337.93512: stderr chunk (state=3): >>><<< 18662 1726867337.93614: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867337.93618: handler run complete 18662 1726867337.93819: variable 'ansible_facts' from source: unknown 18662 1726867337.94082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867337.94672: variable 'ansible_facts' from source: unknown 18662 1726867337.94772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867337.94919: attempt loop complete, returning result 18662 1726867337.94929: _execute() done 18662 1726867337.94937: dumping result to json 18662 1726867337.94970: done dumping result, returning 18662 1726867337.94985: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-efab-a8ce-000000000417] 18662 1726867337.94995: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000417 18662 1726867337.95584: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000417 18662 1726867337.95588: WORKER PROCESS EXITING ok: [managed_node2] 18662 1726867337.96267: no more pending results, returning what we have 18662 1726867337.96270: results queue empty 18662 1726867337.96271: checking for any_errors_fatal 18662 1726867337.96274: done checking for any_errors_fatal 18662 1726867337.96275: checking for max_fail_percentage 18662 1726867337.96279: done checking for max_fail_percentage 18662 1726867337.96280: checking to see if all hosts have failed and the running result is not ok 18662 1726867337.96281: done checking to see if all hosts have failed 18662 1726867337.96281: getting the remaining hosts for this loop 18662 1726867337.96283: done getting the remaining hosts for this loop 18662 1726867337.96287: getting the next task for host managed_node2 18662 1726867337.96293: done getting next task for host managed_node2 18662 1726867337.96295: ^ task is: TASK: meta (flush_handlers) 18662 
1726867337.96297: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867337.96300: getting variables 18662 1726867337.96302: in VariableManager get_vars() 18662 1726867337.96332: Calling all_inventory to load vars for managed_node2 18662 1726867337.96334: Calling groups_inventory to load vars for managed_node2 18662 1726867337.96337: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867337.96347: Calling all_plugins_play to load vars for managed_node2 18662 1726867337.96350: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867337.96354: Calling groups_plugins_play to load vars for managed_node2 18662 1726867337.99521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867338.03248: done with get_vars() 18662 1726867338.03281: done getting variables 18662 1726867338.03421: in VariableManager get_vars() 18662 1726867338.03551: Calling all_inventory to load vars for managed_node2 18662 1726867338.03554: Calling groups_inventory to load vars for managed_node2 18662 1726867338.03557: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867338.03562: Calling all_plugins_play to load vars for managed_node2 18662 1726867338.03564: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867338.03567: Calling groups_plugins_play to load vars for managed_node2 18662 1726867338.05824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867338.08243: done with get_vars() 18662 1726867338.08270: done queuing things up, now waiting for results queue to drain 18662 1726867338.08273: results queue empty 18662 1726867338.08273: checking for any_errors_fatal 18662 1726867338.08279: done checking for any_errors_fatal 18662 1726867338.08280: checking for max_fail_percentage 18662 1726867338.08281: done checking for max_fail_percentage 18662 1726867338.08282: checking to see if all hosts have failed and the running result is not ok 18662 1726867338.08283: done checking to see if all hosts have failed 18662 1726867338.08287: getting the remaining hosts for this loop 18662 1726867338.08288: done getting the remaining hosts for this loop 18662 1726867338.08291: getting the next task for host managed_node2 18662 1726867338.08295: done getting next task for host managed_node2 18662 1726867338.08298: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18662 1726867338.08300: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867338.08309: getting variables 18662 1726867338.08310: in VariableManager get_vars() 18662 1726867338.08324: Calling all_inventory to load vars for managed_node2 18662 1726867338.08327: Calling groups_inventory to load vars for managed_node2 18662 1726867338.08328: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867338.08333: Calling all_plugins_play to load vars for managed_node2 18662 1726867338.08336: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867338.08338: Calling groups_plugins_play to load vars for managed_node2 18662 1726867338.09830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867338.11781: done with get_vars() 18662 1726867338.11801: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:22:18 -0400 (0:00:01.079) 0:00:32.754 ****** 18662 1726867338.11879: entering _queue_task() for managed_node2/include_tasks 18662 1726867338.12216: worker is 1 (out of 1 available) 18662 1726867338.12231: exiting _queue_task() for managed_node2/include_tasks 18662 1726867338.12244: done queuing things up, now waiting for results queue to drain 18662 1726867338.12245: waiting for pending results... 18662 1726867338.12548: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 18662 1726867338.12618: in run() - task 0affcac9-a3a5-efab-a8ce-00000000005c 18662 1726867338.12643: variable 'ansible_search_path' from source: unknown 18662 1726867338.12652: variable 'ansible_search_path' from source: unknown 18662 1726867338.12695: calling self._execute() 18662 1726867338.12803: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867338.12820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867338.12836: variable 'omit' from source: magic vars 18662 1726867338.13302: variable 'ansible_distribution_major_version' from source: facts 18662 1726867338.13305: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867338.13311: _execute() done 18662 1726867338.13314: dumping result to json 18662 1726867338.13317: done dumping result, returning 18662 1726867338.13320: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-efab-a8ce-00000000005c] 18662 1726867338.13322: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000005c 18662 1726867338.13439: no more pending results, returning what we have 18662 1726867338.13445: in VariableManager get_vars() 18662 1726867338.13486: Calling all_inventory to load vars for managed_node2 18662 1726867338.13488: Calling groups_inventory to load vars for managed_node2 18662 1726867338.13491: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867338.13503: Calling all_plugins_play to load vars for managed_node2 18662 1726867338.13506: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867338.13511: Calling groups_plugins_play to load vars for managed_node2 18662 1726867338.14285: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000005c 18662 1726867338.14288: WORKER PROCESS EXITING 18662 1726867338.15267: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867338.16849: done with get_vars() 18662 1726867338.16868: variable 'ansible_search_path' from source: unknown 18662 1726867338.16869: variable 'ansible_search_path' from source: unknown 18662 1726867338.16896: we have included files to process 18662 1726867338.16897: generating all_blocks data 18662 1726867338.16898: done generating all_blocks data 18662 1726867338.16899: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18662 1726867338.16900: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18662 1726867338.16902: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 18662 1726867338.17417: done processing included file 18662 1726867338.17419: iterating over new_blocks loaded from include file 18662 1726867338.17420: in VariableManager get_vars() 18662 1726867338.17442: done with get_vars() 18662 1726867338.17444: filtering new block on tags 18662 1726867338.17460: done filtering new block on tags 18662 1726867338.17463: in VariableManager get_vars() 18662 1726867338.17485: done with get_vars() 18662 1726867338.17487: filtering new block on tags 18662 1726867338.17505: done filtering new block on tags 18662 1726867338.17507: in VariableManager get_vars() 18662 1726867338.17529: done with get_vars() 18662 1726867338.17531: filtering new block on tags 18662 1726867338.17547: done filtering new block on tags 18662 1726867338.17549: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 18662 1726867338.17554: extending task lists for all hosts with included blocks 18662 1726867338.17931: done extending task lists 18662 1726867338.17933: done processing included files 18662 1726867338.17933: results queue empty 18662 1726867338.17934: checking for any_errors_fatal 18662 1726867338.17935: done checking for any_errors_fatal 18662 1726867338.17936: checking for max_fail_percentage 18662 1726867338.17937: done checking for max_fail_percentage 18662 1726867338.17938: checking to see if all hosts have failed and the running result is not ok 18662 1726867338.17938: done checking to see if all hosts have failed 18662 1726867338.17939: getting the remaining hosts for this loop 18662 1726867338.17940: done getting the remaining hosts for this loop 18662 1726867338.17943: getting the next task for host managed_node2 18662 1726867338.17946: done getting next task for host managed_node2 18662 1726867338.17949: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18662 1726867338.17952: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867338.17960: getting variables 18662 1726867338.17961: in VariableManager get_vars() 18662 1726867338.17974: Calling all_inventory to load vars for managed_node2 18662 1726867338.17976: Calling groups_inventory to load vars for managed_node2 18662 1726867338.17980: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867338.17985: Calling all_plugins_play to load vars for managed_node2 18662 1726867338.17988: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867338.17991: Calling groups_plugins_play to load vars for managed_node2 18662 1726867338.19144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867338.20694: done with get_vars() 18662 1726867338.20718: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:22:18 -0400 (0:00:00.089) 0:00:32.843 ****** 18662 1726867338.20788: entering _queue_task() for managed_node2/setup 18662 1726867338.21111: worker is 1 (out of 1 available) 18662 1726867338.21123: exiting _queue_task() for managed_node2/setup 18662 1726867338.21136: done queuing things up, now waiting for results queue to drain 18662 1726867338.21137: waiting for pending results... 18662 1726867338.21419: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 18662 1726867338.21559: in run() - task 0affcac9-a3a5-efab-a8ce-000000000458 18662 1726867338.21582: variable 'ansible_search_path' from source: unknown 18662 1726867338.21589: variable 'ansible_search_path' from source: unknown 18662 1726867338.21636: calling self._execute() 18662 1726867338.21736: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867338.21749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867338.21766: variable 'omit' from source: magic vars 18662 1726867338.22111: variable 'ansible_distribution_major_version' from source: facts 18662 1726867338.22126: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867338.22333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867338.24594: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867338.24663: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867338.24712: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867338.24759: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867338.24794: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867338.24887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867338.24927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 18662 1726867338.24962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867338.25013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867338.25035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867338.25099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867338.25131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867338.25160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867338.25212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867338.25233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867338.25583: variable '__network_required_facts' from source: role '' defaults 18662 1726867338.25586: variable 'ansible_facts' from source: unknown 18662 1726867338.26151: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 18662 1726867338.26160: when evaluation is False, skipping this task 18662 1726867338.26168: _execute() done 18662 1726867338.26175: dumping result to json 18662 1726867338.26184: done dumping result, returning 18662 1726867338.26196: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-efab-a8ce-000000000458] 18662 1726867338.26204: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000458 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867338.26356: no more pending results, returning what we have 18662 1726867338.26361: results queue empty 18662 1726867338.26362: checking for any_errors_fatal 18662 1726867338.26364: done checking for any_errors_fatal 18662 1726867338.26365: checking for max_fail_percentage 18662 1726867338.26366: done checking for max_fail_percentage 18662 1726867338.26367: checking to see if all hosts have failed and the running result is not ok 18662 1726867338.26368: done checking to see if all hosts have failed 18662 1726867338.26369: getting the remaining hosts for this loop 18662 1726867338.26371: done getting the remaining hosts for this loop 18662 1726867338.26374: getting the next task for host managed_node2 18662 1726867338.26385: done getting next task for host 
managed_node2 18662 1726867338.26389: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 18662 1726867338.26393: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867338.26405: getting variables 18662 1726867338.26407: in VariableManager get_vars() 18662 1726867338.26452: Calling all_inventory to load vars for managed_node2 18662 1726867338.26456: Calling groups_inventory to load vars for managed_node2 18662 1726867338.26458: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867338.26470: Calling all_plugins_play to load vars for managed_node2 18662 1726867338.26473: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867338.26682: Calling groups_plugins_play to load vars for managed_node2 18662 1726867338.27390: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000458 18662 1726867338.27393: WORKER PROCESS EXITING 18662 1726867338.28480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867338.30164: done with get_vars() 18662 1726867338.30187: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:22:18 -0400 (0:00:00.094) 0:00:32.938 ****** 18662 1726867338.30284: entering _queue_task() for managed_node2/stat 18662 1726867338.30601: worker is 1 (out of 1 available) 18662 1726867338.30616: exiting _queue_task() for managed_node2/stat 18662 1726867338.30630: done queuing things up, now waiting for results queue to drain 18662 1726867338.30632: waiting for pending results... 
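For context on the skip recorded just above: the task at roles/network/tasks/set_facts.yml:3 queues the setup action and is guarded by the expression the trace shows being evaluated, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, so it only gathers facts the role needs that are not already present; the result is censored because no_log: true is set on it. The task body itself is not reproduced in this log; a minimal sketch consistent with the trace (the gather_subset value is an assumption) is:

    # Hypothetical reconstruction -- the 'when' expression and no_log: true are
    # confirmed by the trace above; gather_subset is an assumption.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true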
18662 1726867338.30912: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 18662 1726867338.31058: in run() - task 0affcac9-a3a5-efab-a8ce-00000000045a 18662 1726867338.31080: variable 'ansible_search_path' from source: unknown 18662 1726867338.31089: variable 'ansible_search_path' from source: unknown 18662 1726867338.31139: calling self._execute() 18662 1726867338.31244: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867338.31257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867338.31274: variable 'omit' from source: magic vars 18662 1726867338.31670: variable 'ansible_distribution_major_version' from source: facts 18662 1726867338.31691: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867338.32070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867338.32563: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867338.32748: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867338.32823: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867338.32861: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867338.33101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867338.33113: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867338.33149: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867338.33180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867338.33280: variable '__network_is_ostree' from source: set_fact 18662 1726867338.33293: Evaluated conditional (not __network_is_ostree is defined): False 18662 1726867338.33300: when evaluation is False, skipping this task 18662 1726867338.33306: _execute() done 18662 1726867338.33316: dumping result to json 18662 1726867338.33323: done dumping result, returning 18662 1726867338.33333: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-efab-a8ce-00000000045a] 18662 1726867338.33342: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000045a skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18662 1726867338.33514: no more pending results, returning what we have 18662 1726867338.33519: results queue empty 18662 1726867338.33520: checking for any_errors_fatal 18662 1726867338.33526: done checking for any_errors_fatal 18662 1726867338.33527: checking for max_fail_percentage 18662 1726867338.33529: done checking for max_fail_percentage 18662 1726867338.33530: checking to see if all hosts have 
failed and the running result is not ok 18662 1726867338.33531: done checking to see if all hosts have failed 18662 1726867338.33531: getting the remaining hosts for this loop 18662 1726867338.33533: done getting the remaining hosts for this loop 18662 1726867338.33536: getting the next task for host managed_node2 18662 1726867338.33544: done getting next task for host managed_node2 18662 1726867338.33547: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18662 1726867338.33550: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867338.33564: getting variables 18662 1726867338.33565: in VariableManager get_vars() 18662 1726867338.33604: Calling all_inventory to load vars for managed_node2 18662 1726867338.33607: Calling groups_inventory to load vars for managed_node2 18662 1726867338.33612: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867338.33623: Calling all_plugins_play to load vars for managed_node2 18662 1726867338.33627: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867338.33630: Calling groups_plugins_play to load vars for managed_node2 18662 1726867338.34291: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000045a 18662 1726867338.34294: WORKER PROCESS EXITING 18662 1726867338.35202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867338.36940: done with get_vars() 18662 1726867338.36960: done getting variables 18662 1726867338.37022: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:22:18 -0400 (0:00:00.067) 0:00:33.006 ****** 18662 1726867338.37054: entering _queue_task() for managed_node2/set_fact 18662 1726867338.37323: worker is 1 (out of 1 available) 18662 1726867338.37334: exiting _queue_task() for managed_node2/set_fact 18662 1726867338.37347: done queuing things up, now waiting for results queue to drain 18662 1726867338.37348: waiting for pending results... 
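The two skips around this point (set_facts.yml:12 and set_facts.yml:17) are an ostree detection pair: a stat task records whether the host is ostree-booted, and a set_fact task stores the flag, both guarded by not __network_is_ostree is defined so they run at most once per play. The trace confirms only the module types and the guard; the stat path, register name, and fact expression below are assumptions for illustration:

    # Sketch only -- stat path and register name are not shown in the log.
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # assumed path
      register: __ostree_booted_stat    # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined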
18662 1726867338.37612: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 18662 1726867338.37756: in run() - task 0affcac9-a3a5-efab-a8ce-00000000045b 18662 1726867338.37779: variable 'ansible_search_path' from source: unknown 18662 1726867338.37788: variable 'ansible_search_path' from source: unknown 18662 1726867338.37833: calling self._execute() 18662 1726867338.37927: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867338.37937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867338.37948: variable 'omit' from source: magic vars 18662 1726867338.38324: variable 'ansible_distribution_major_version' from source: facts 18662 1726867338.38343: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867338.38496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867338.38761: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867338.38812: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867338.38853: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867338.38982: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867338.38986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867338.39020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867338.39052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867338.39088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867338.39184: variable '__network_is_ostree' from source: set_fact 18662 1726867338.39196: Evaluated conditional (not __network_is_ostree is defined): False 18662 1726867338.39203: when evaluation is False, skipping this task 18662 1726867338.39214: _execute() done 18662 1726867338.39223: dumping result to json 18662 1726867338.39230: done dumping result, returning 18662 1726867338.39241: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-efab-a8ce-00000000045b] 18662 1726867338.39249: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000045b skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 18662 1726867338.39385: no more pending results, returning what we have 18662 1726867338.39389: results queue empty 18662 1726867338.39390: checking for any_errors_fatal 18662 1726867338.39397: done checking for any_errors_fatal 18662 1726867338.39398: checking for max_fail_percentage 18662 1726867338.39400: done checking for max_fail_percentage 18662 1726867338.39401: checking to see 
if all hosts have failed and the running result is not ok 18662 1726867338.39402: done checking to see if all hosts have failed 18662 1726867338.39402: getting the remaining hosts for this loop 18662 1726867338.39405: done getting the remaining hosts for this loop 18662 1726867338.39411: getting the next task for host managed_node2 18662 1726867338.39422: done getting next task for host managed_node2 18662 1726867338.39425: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 18662 1726867338.39428: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867338.39442: getting variables 18662 1726867338.39444: in VariableManager get_vars() 18662 1726867338.39484: Calling all_inventory to load vars for managed_node2 18662 1726867338.39486: Calling groups_inventory to load vars for managed_node2 18662 1726867338.39489: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867338.39500: Calling all_plugins_play to load vars for managed_node2 18662 1726867338.39504: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867338.39507: Calling groups_plugins_play to load vars for managed_node2 18662 1726867338.40664: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000045b 18662 1726867338.40667: WORKER PROCESS EXITING 18662 1726867338.42312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867338.44302: done with get_vars() 18662 1726867338.44327: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:22:18 -0400 (0:00:00.073) 0:00:33.079 ****** 18662 1726867338.44420: entering _queue_task() for managed_node2/service_facts 18662 1726867338.44735: worker is 1 (out of 1 available) 18662 1726867338.44748: exiting _queue_task() for managed_node2/service_facts 18662 1726867338.44761: done queuing things up, now waiting for results queue to drain 18662 1726867338.44763: waiting for pending results... 
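The task queued here (set_facts.yml:21) runs the service_facts module on managed_node2; the large ansible_facts.services mapping dumped further down in this log is its return value, one entry per unit with name, state, status, and source. A minimal sketch of a task of this kind, plus an illustrative consumer (the debug task is not part of the role; it is added only to show how the gathered mapping is read):

    - name: Check which services are running
      ansible.builtin.service_facts:

    # Illustration only: read one entry from the gathered mapping.
    - name: Report NetworkManager state
      ansible.builtin.debug:
        msg: "{{ ansible_facts.services['NetworkManager.service'].state | default('unknown') }}"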
18662 1726867338.45062: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 18662 1726867338.45211: in run() - task 0affcac9-a3a5-efab-a8ce-00000000045d 18662 1726867338.45232: variable 'ansible_search_path' from source: unknown 18662 1726867338.45241: variable 'ansible_search_path' from source: unknown 18662 1726867338.45584: calling self._execute() 18662 1726867338.45588: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867338.45590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867338.45606: variable 'omit' from source: magic vars 18662 1726867338.46315: variable 'ansible_distribution_major_version' from source: facts 18662 1726867338.46331: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867338.46340: variable 'omit' from source: magic vars 18662 1726867338.46441: variable 'omit' from source: magic vars 18662 1726867338.46540: variable 'omit' from source: magic vars 18662 1726867338.46583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867338.46672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867338.46701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867338.46727: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867338.46747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867338.46793: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867338.46802: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867338.46813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867338.46919: Set connection var ansible_timeout to 10 18662 1726867338.46927: Set connection var ansible_connection to ssh 18662 1726867338.46937: Set connection var ansible_shell_executable to /bin/sh 18662 1726867338.46943: Set connection var ansible_shell_type to sh 18662 1726867338.46960: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867338.46971: Set connection var ansible_pipelining to False 18662 1726867338.47001: variable 'ansible_shell_executable' from source: unknown 18662 1726867338.47012: variable 'ansible_connection' from source: unknown 18662 1726867338.47021: variable 'ansible_module_compression' from source: unknown 18662 1726867338.47028: variable 'ansible_shell_type' from source: unknown 18662 1726867338.47035: variable 'ansible_shell_executable' from source: unknown 18662 1726867338.47043: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867338.47052: variable 'ansible_pipelining' from source: unknown 18662 1726867338.47059: variable 'ansible_timeout' from source: unknown 18662 1726867338.47071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867338.47266: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867338.47288: variable 'omit' from source: magic vars 18662 
1726867338.47299: starting attempt loop 18662 1726867338.47306: running the handler 18662 1726867338.47392: _low_level_execute_command(): starting 18662 1726867338.47396: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867338.48160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867338.48190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867338.48273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867338.49947: stdout chunk (state=3): >>>/root <<< 18662 1726867338.50109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867338.50112: stdout chunk (state=3): >>><<< 18662 1726867338.50114: stderr chunk (state=3): >>><<< 18662 1726867338.50165: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867338.50169: _low_level_execute_command(): starting 18662 1726867338.50172: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886 `" && echo ansible-tmp-1726867338.5013266-20246-27257352799886="` echo /root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886 `" ) && sleep 0' 18662 1726867338.50752: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867338.50769: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867338.50833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867338.50901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867338.50955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867338.50972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867338.51011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867338.53003: stdout chunk (state=3): >>>ansible-tmp-1726867338.5013266-20246-27257352799886=/root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886 <<< 18662 1726867338.53093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867338.53096: stdout chunk (state=3): >>><<< 18662 1726867338.53317: stderr chunk (state=3): >>><<< 18662 1726867338.53321: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867338.5013266-20246-27257352799886=/root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867338.53324: variable 'ansible_module_compression' from source: unknown 18662 1726867338.53327: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 18662 1726867338.53461: variable 'ansible_facts' from source: unknown 18662 
1726867338.53704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886/AnsiballZ_service_facts.py 18662 1726867338.53876: Sending initial data 18662 1726867338.53980: Sent initial data (161 bytes) 18662 1726867338.54444: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867338.54457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867338.54470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867338.54489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867338.54504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867338.54517: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867338.54597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867338.54625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867338.54641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867338.54652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867338.54739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867338.56349: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867338.56391: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867338.56440: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp12p9l0_d /root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886/AnsiballZ_service_facts.py <<< 18662 1726867338.56444: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886/AnsiballZ_service_facts.py" <<< 18662 1726867338.56476: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp12p9l0_d" to remote "/root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886/AnsiballZ_service_facts.py" <<< 18662 1726867338.57214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867338.57340: stderr chunk (state=3): >>><<< 18662 1726867338.57343: stdout chunk (state=3): >>><<< 18662 1726867338.57346: done transferring module to remote 18662 1726867338.57348: _low_level_execute_command(): starting 18662 1726867338.57350: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886/ /root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886/AnsiballZ_service_facts.py && sleep 0' 18662 1726867338.57860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867338.57874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867338.57898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867338.57923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867338.57941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867338.57953: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867338.57967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867338.58032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867338.58075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867338.58095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867338.58118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867338.58192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867338.60211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867338.60214: stdout chunk (state=3): >>><<< 18662 1726867338.60217: stderr chunk (state=3): >>><<< 18662 1726867338.60219: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867338.60222: _low_level_execute_command(): starting 18662 1726867338.60224: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886/AnsiballZ_service_facts.py && sleep 0' 18662 1726867338.60745: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867338.60758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867338.60772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867338.60791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867338.60806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867338.60817: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867338.60829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867338.60898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867338.60938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867338.60959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867338.60971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867338.61066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867340.26930: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": 
"auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": 
"initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 18662 1726867340.27008: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": 
"systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": 
{"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 18662 1726867340.27031: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 18662 1726867340.28570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867340.28647: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. <<< 18662 1726867340.28650: stdout chunk (state=3): >>><<< 18662 1726867340.28653: stderr chunk (state=3): >>><<< 18662 1726867340.28673: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", 
"source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867340.29484: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867340.29487: _low_level_execute_command(): starting 18662 1726867340.29490: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867338.5013266-20246-27257352799886/ > /dev/null 2>&1 && sleep 0' 18662 1726867340.30089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867340.30104: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867340.30121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867340.30141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867340.30187: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867340.30258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867340.30284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867340.30329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867340.30383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 
1726867340.32375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867340.32380: stdout chunk (state=3): >>><<< 18662 1726867340.32382: stderr chunk (state=3): >>><<< 18662 1726867340.32583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867340.32587: handler run complete 18662 1726867340.32638: variable 'ansible_facts' from source: unknown 18662 1726867340.32806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867340.33595: variable 'ansible_facts' from source: unknown 18662 1726867340.33731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867340.33936: attempt loop complete, returning result 18662 1726867340.33948: _execute() done 18662 1726867340.33956: dumping result to json 18662 1726867340.34023: done dumping result, returning 18662 1726867340.34037: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-efab-a8ce-00000000045d] 18662 1726867340.34047: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000045d 18662 1726867340.35390: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000045d 18662 1726867340.35394: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867340.35503: no more pending results, returning what we have 18662 1726867340.35506: results queue empty 18662 1726867340.35507: checking for any_errors_fatal 18662 1726867340.35513: done checking for any_errors_fatal 18662 1726867340.35514: checking for max_fail_percentage 18662 1726867340.35516: done checking for max_fail_percentage 18662 1726867340.35516: checking to see if all hosts have failed and the running result is not ok 18662 1726867340.35517: done checking to see if all hosts have failed 18662 1726867340.35518: getting the remaining hosts for this loop 18662 1726867340.35519: done getting the remaining hosts for this loop 18662 1726867340.35522: getting the next task for host managed_node2 18662 1726867340.35528: done getting next task for host managed_node2 18662 1726867340.35533: ^ task is: TASK: fedora.linux_system_roles.network : Check 
which packages are installed 18662 1726867340.35536: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867340.35546: getting variables 18662 1726867340.35547: in VariableManager get_vars() 18662 1726867340.35583: Calling all_inventory to load vars for managed_node2 18662 1726867340.35587: Calling groups_inventory to load vars for managed_node2 18662 1726867340.35590: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867340.35599: Calling all_plugins_play to load vars for managed_node2 18662 1726867340.35602: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867340.35605: Calling groups_plugins_play to load vars for managed_node2 18662 1726867340.36854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867340.38451: done with get_vars() 18662 1726867340.38475: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:22:20 -0400 (0:00:01.941) 0:00:35.021 ****** 18662 1726867340.38582: entering _queue_task() for managed_node2/package_facts 18662 1726867340.38946: worker is 1 (out of 1 available) 18662 1726867340.38961: exiting _queue_task() for managed_node2/package_facts 18662 1726867340.38975: done queuing things up, now waiting for results queue to drain 18662 1726867340.38978: waiting for pending results... 
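The two tasks traced here are plain fact-gathering steps: service_facts populates ansible_facts.services (the large dictionary dumped above), and package_facts populates ansible_facts.packages. As a rough, hypothetical sketch of how such facts are typically gathered and consumed in a playbook (the actual logic in the role's set_facts.yml is not shown in this log, so everything below is illustrative only, with the module names and fact keys taken from the output above):

    # Illustrative only -- not the contents of
    # fedora.linux_system_roles.network/roles/network/tasks/set_facts.yml
    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: Check which packages are installed
      ansible.builtin.package_facts:

    - name: Example consumer of the gathered service facts (hypothetical)
      ansible.builtin.debug:
        msg: "NetworkManager is running"
      when:
        - "'NetworkManager.service' in ansible_facts.services"
        - "ansible_facts.services['NetworkManager.service'].state == 'running'"
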
18662 1726867340.39300: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 18662 1726867340.39405: in run() - task 0affcac9-a3a5-efab-a8ce-00000000045e 18662 1726867340.39426: variable 'ansible_search_path' from source: unknown 18662 1726867340.39435: variable 'ansible_search_path' from source: unknown 18662 1726867340.39479: calling self._execute() 18662 1726867340.39620: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867340.39623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867340.39627: variable 'omit' from source: magic vars 18662 1726867340.39998: variable 'ansible_distribution_major_version' from source: facts 18662 1726867340.40016: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867340.40027: variable 'omit' from source: magic vars 18662 1726867340.40162: variable 'omit' from source: magic vars 18662 1726867340.40165: variable 'omit' from source: magic vars 18662 1726867340.40184: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867340.40223: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867340.40250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867340.40280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867340.40297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867340.40331: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867340.40341: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867340.40350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867340.40462: Set connection var ansible_timeout to 10 18662 1726867340.40583: Set connection var ansible_connection to ssh 18662 1726867340.40590: Set connection var ansible_shell_executable to /bin/sh 18662 1726867340.40593: Set connection var ansible_shell_type to sh 18662 1726867340.40602: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867340.40613: Set connection var ansible_pipelining to False 18662 1726867340.40810: variable 'ansible_shell_executable' from source: unknown 18662 1726867340.40813: variable 'ansible_connection' from source: unknown 18662 1726867340.40816: variable 'ansible_module_compression' from source: unknown 18662 1726867340.40819: variable 'ansible_shell_type' from source: unknown 18662 1726867340.40821: variable 'ansible_shell_executable' from source: unknown 18662 1726867340.40823: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867340.40826: variable 'ansible_pipelining' from source: unknown 18662 1726867340.40828: variable 'ansible_timeout' from source: unknown 18662 1726867340.40830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867340.41172: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867340.41257: variable 'omit' from source: magic vars 18662 
1726867340.41266: starting attempt loop 18662 1726867340.41272: running the handler 18662 1726867340.41290: _low_level_execute_command(): starting 18662 1726867340.41300: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867340.42431: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867340.42488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867340.42558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867340.42582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867340.42613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867340.42697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867340.44595: stdout chunk (state=3): >>>/root <<< 18662 1726867340.44599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867340.44602: stdout chunk (state=3): >>><<< 18662 1726867340.44604: stderr chunk (state=3): >>><<< 18662 1726867340.44608: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867340.44783: _low_level_execute_command(): starting 18662 1726867340.44787: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884 `" && echo 
ansible-tmp-1726867340.4470758-20343-237627981920884="` echo /root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884 `" ) && sleep 0' 18662 1726867340.45920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867340.45924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867340.45930: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867340.45940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867340.46130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867340.46166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867340.46239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867340.48231: stdout chunk (state=3): >>>ansible-tmp-1726867340.4470758-20343-237627981920884=/root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884 <<< 18662 1726867340.48344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867340.48635: stderr chunk (state=3): >>><<< 18662 1726867340.48638: stdout chunk (state=3): >>><<< 18662 1726867340.48641: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867340.4470758-20343-237627981920884=/root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867340.48643: variable 'ansible_module_compression' from source: unknown 18662 1726867340.48646: ANSIBALLZ: using 
cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 18662 1726867340.48844: variable 'ansible_facts' from source: unknown 18662 1726867340.49193: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884/AnsiballZ_package_facts.py 18662 1726867340.49599: Sending initial data 18662 1726867340.49612: Sent initial data (162 bytes) 18662 1726867340.50551: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867340.50593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867340.50630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867340.50641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867340.51044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867340.51096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867340.52765: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867340.52871: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867340.52997: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpua5oawpc /root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884/AnsiballZ_package_facts.py <<< 18662 1726867340.53000: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884/AnsiballZ_package_facts.py" <<< 18662 1726867340.53033: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpua5oawpc" to remote "/root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884/AnsiballZ_package_facts.py" <<< 18662 1726867340.55797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867340.55883: stderr chunk (state=3): >>><<< 18662 1726867340.55887: stdout chunk (state=3): >>><<< 18662 1726867340.55889: done transferring module to remote 18662 1726867340.55891: _low_level_execute_command(): starting 18662 1726867340.55894: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884/ /root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884/AnsiballZ_package_facts.py && sleep 0' 18662 1726867340.56685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867340.56701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867340.56721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867340.56744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867340.56853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867340.56886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867340.56914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867340.56930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867340.57197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867340.59014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867340.59018: stdout chunk (state=3): >>><<< 18662 1726867340.59020: stderr chunk (state=3): >>><<< 18662 1726867340.59023: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867340.59025: _low_level_execute_command(): starting 18662 1726867340.59028: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884/AnsiballZ_package_facts.py && sleep 0' 18662 1726867340.59966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867340.59981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867340.59996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867340.60093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867340.60116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867340.60134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867340.60144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867340.60416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867341.05479: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": 
[{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 18662 1726867341.05505: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": 
"gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 18662 1726867341.05675: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": 
"procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": 
"psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": 
"1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": 
[{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": 
"perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": 
"rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 18662 1726867341.05696: stdout chunk (state=3): 
>>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": 
"dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 18662 1726867341.05701: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 18662 1726867341.07590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867341.07593: stdout chunk (state=3): >>><<< 18662 1726867341.07596: stderr chunk (state=3): >>><<< 18662 1726867341.07935: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867341.17966: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867341.18005: _low_level_execute_command(): starting 18662 1726867341.18021: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867340.4470758-20343-237627981920884/ > /dev/null 2>&1 && sleep 0' 18662 1726867341.19175: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867341.19281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867341.19418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867341.19513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867341.21983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867341.21987: stdout chunk (state=3): >>><<< 18662 1726867341.21989: stderr chunk (state=3): >>><<< 18662 1726867341.21992: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867341.21995: handler run complete 18662 1726867341.23452: variable 'ansible_facts' from source: unknown 18662 1726867341.24025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867341.26612: variable 'ansible_facts' from source: unknown 18662 1726867341.27064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867341.27723: attempt loop complete, returning result 18662 1726867341.27741: _execute() done 18662 1726867341.27750: dumping result to json 18662 1726867341.27941: done dumping result, returning 18662 1726867341.27954: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-efab-a8ce-00000000045e] 18662 1726867341.27963: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000045e 18662 1726867341.34240: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000045e 18662 1726867341.34243: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867341.34354: no more pending results, returning what we have 18662 1726867341.34356: results queue empty 18662 1726867341.34356: checking for any_errors_fatal 18662 1726867341.34360: done checking for any_errors_fatal 18662 1726867341.34361: checking for max_fail_percentage 18662 1726867341.34362: done checking for max_fail_percentage 18662 1726867341.34363: checking to see if all hosts have failed and the running result is not ok 18662 1726867341.34364: done checking to see if all hosts have failed 18662 1726867341.34365: getting the remaining hosts for this loop 18662 1726867341.34366: done getting the remaining hosts for this loop 18662 1726867341.34368: getting the next task for host managed_node2 18662 1726867341.34373: done getting next task for host managed_node2 18662 1726867341.34375: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 18662 1726867341.34379: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867341.34387: getting variables 18662 1726867341.34388: in VariableManager get_vars() 18662 1726867341.34407: Calling all_inventory to load vars for managed_node2 18662 1726867341.34409: Calling groups_inventory to load vars for managed_node2 18662 1726867341.34411: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867341.34417: Calling all_plugins_play to load vars for managed_node2 18662 1726867341.34420: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867341.34423: Calling groups_plugins_play to load vars for managed_node2 18662 1726867341.35701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867341.37217: done with get_vars() 18662 1726867341.37238: done getting variables 18662 1726867341.37291: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:22:21 -0400 (0:00:00.987) 0:00:36.008 ****** 18662 1726867341.37317: entering _queue_task() for managed_node2/debug 18662 1726867341.37646: worker is 1 (out of 1 available) 18662 1726867341.37657: exiting _queue_task() for managed_node2/debug 18662 1726867341.37670: done queuing things up, now waiting for results queue to drain 18662 1726867341.37671: waiting for pending results... 18662 1726867341.38094: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 18662 1726867341.38098: in run() - task 0affcac9-a3a5-efab-a8ce-00000000005d 18662 1726867341.38101: variable 'ansible_search_path' from source: unknown 18662 1726867341.38103: variable 'ansible_search_path' from source: unknown 18662 1726867341.38106: calling self._execute() 18662 1726867341.38175: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867341.38193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867341.38209: variable 'omit' from source: magic vars 18662 1726867341.38594: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.38611: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867341.38622: variable 'omit' from source: magic vars 18662 1726867341.38670: variable 'omit' from source: magic vars 18662 1726867341.38771: variable 'network_provider' from source: set_fact 18662 1726867341.38797: variable 'omit' from source: magic vars 18662 1726867341.38840: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867341.38886: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867341.38911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867341.38937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867341.38951: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 
1726867341.38991: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867341.39000: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867341.39009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867341.39116: Set connection var ansible_timeout to 10 18662 1726867341.39182: Set connection var ansible_connection to ssh 18662 1726867341.39185: Set connection var ansible_shell_executable to /bin/sh 18662 1726867341.39193: Set connection var ansible_shell_type to sh 18662 1726867341.39195: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867341.39197: Set connection var ansible_pipelining to False 18662 1726867341.39199: variable 'ansible_shell_executable' from source: unknown 18662 1726867341.39202: variable 'ansible_connection' from source: unknown 18662 1726867341.39204: variable 'ansible_module_compression' from source: unknown 18662 1726867341.39206: variable 'ansible_shell_type' from source: unknown 18662 1726867341.39210: variable 'ansible_shell_executable' from source: unknown 18662 1726867341.39218: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867341.39227: variable 'ansible_pipelining' from source: unknown 18662 1726867341.39235: variable 'ansible_timeout' from source: unknown 18662 1726867341.39243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867341.39382: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867341.39398: variable 'omit' from source: magic vars 18662 1726867341.39520: starting attempt loop 18662 1726867341.39523: running the handler 18662 1726867341.39526: handler run complete 18662 1726867341.39528: attempt loop complete, returning result 18662 1726867341.39530: _execute() done 18662 1726867341.39532: dumping result to json 18662 1726867341.39535: done dumping result, returning 18662 1726867341.39538: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-efab-a8ce-00000000005d] 18662 1726867341.39540: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000005d 18662 1726867341.39613: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000005d 18662 1726867341.39619: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 18662 1726867341.39673: no more pending results, returning what we have 18662 1726867341.39678: results queue empty 18662 1726867341.39680: checking for any_errors_fatal 18662 1726867341.39691: done checking for any_errors_fatal 18662 1726867341.39692: checking for max_fail_percentage 18662 1726867341.39694: done checking for max_fail_percentage 18662 1726867341.39694: checking to see if all hosts have failed and the running result is not ok 18662 1726867341.39695: done checking to see if all hosts have failed 18662 1726867341.39696: getting the remaining hosts for this loop 18662 1726867341.39698: done getting the remaining hosts for this loop 18662 1726867341.39702: getting the next task for host managed_node2 18662 1726867341.39709: done getting next task for host managed_node2 18662 1726867341.39712: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 18662 1726867341.39714: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867341.39724: getting variables 18662 1726867341.39726: in VariableManager get_vars() 18662 1726867341.39761: Calling all_inventory to load vars for managed_node2 18662 1726867341.39764: Calling groups_inventory to load vars for managed_node2 18662 1726867341.39766: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867341.39879: Calling all_plugins_play to load vars for managed_node2 18662 1726867341.39885: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867341.39889: Calling groups_plugins_play to load vars for managed_node2 18662 1726867341.41345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867341.42898: done with get_vars() 18662 1726867341.42920: done getting variables 18662 1726867341.42973: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:22:21 -0400 (0:00:00.056) 0:00:36.065 ****** 18662 1726867341.43004: entering _queue_task() for managed_node2/fail 18662 1726867341.43307: worker is 1 (out of 1 available) 18662 1726867341.43318: exiting _queue_task() for managed_node2/fail 18662 1726867341.43330: done queuing things up, now waiting for results queue to drain 18662 1726867341.43331: waiting for pending results... 
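The entries above show the role's debug task at roles/network/tasks/main.yml:7 resolving network_provider to "nm", and the runner then queuing the fail-style guard at main.yml:11. A minimal illustrative sketch of that task pattern follows; it is not the role's actual source, and the message text and the initscripts condition are assumptions inferred from the task name (only the `network_state != {}` condition is confirmed by the evaluation logged below):

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"

    - name: Abort if network_state is used with the initscripts provider
      ansible.builtin.fail:
        msg: "network_state is not supported with the initscripts provider"  # assumed message text
      when:
        - network_state != {}                  # condition evaluated (False) in the log below
        - network_provider == "initscripts"    # assumed from the task name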
18662 1726867341.43593: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 18662 1726867341.43724: in run() - task 0affcac9-a3a5-efab-a8ce-00000000005e 18662 1726867341.43744: variable 'ansible_search_path' from source: unknown 18662 1726867341.43752: variable 'ansible_search_path' from source: unknown 18662 1726867341.43800: calling self._execute() 18662 1726867341.43892: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867341.43910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867341.43925: variable 'omit' from source: magic vars 18662 1726867341.44314: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.44334: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867341.44464: variable 'network_state' from source: role '' defaults 18662 1726867341.44560: Evaluated conditional (network_state != {}): False 18662 1726867341.44564: when evaluation is False, skipping this task 18662 1726867341.44566: _execute() done 18662 1726867341.44569: dumping result to json 18662 1726867341.44571: done dumping result, returning 18662 1726867341.44573: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-efab-a8ce-00000000005e] 18662 1726867341.44576: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000005e 18662 1726867341.44653: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000005e 18662 1726867341.44657: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867341.44713: no more pending results, returning what we have 18662 1726867341.44718: results queue empty 18662 1726867341.44719: checking for any_errors_fatal 18662 1726867341.44726: done checking for any_errors_fatal 18662 1726867341.44727: checking for max_fail_percentage 18662 1726867341.44729: done checking for max_fail_percentage 18662 1726867341.44730: checking to see if all hosts have failed and the running result is not ok 18662 1726867341.44731: done checking to see if all hosts have failed 18662 1726867341.44731: getting the remaining hosts for this loop 18662 1726867341.44732: done getting the remaining hosts for this loop 18662 1726867341.44736: getting the next task for host managed_node2 18662 1726867341.44744: done getting next task for host managed_node2 18662 1726867341.44747: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18662 1726867341.44750: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867341.44765: getting variables 18662 1726867341.44767: in VariableManager get_vars() 18662 1726867341.44809: Calling all_inventory to load vars for managed_node2 18662 1726867341.44812: Calling groups_inventory to load vars for managed_node2 18662 1726867341.44814: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867341.44826: Calling all_plugins_play to load vars for managed_node2 18662 1726867341.44829: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867341.44831: Calling groups_plugins_play to load vars for managed_node2 18662 1726867341.46457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867341.47995: done with get_vars() 18662 1726867341.48017: done getting variables 18662 1726867341.48079: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:22:21 -0400 (0:00:00.051) 0:00:36.116 ****** 18662 1726867341.48110: entering _queue_task() for managed_node2/fail 18662 1726867341.48419: worker is 1 (out of 1 available) 18662 1726867341.48430: exiting _queue_task() for managed_node2/fail 18662 1726867341.48444: done queuing things up, now waiting for results queue to drain 18662 1726867341.48445: waiting for pending results... 
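The large JSON document earlier in this section is the return payload of the package_facts module (its invocation shows manager=["auto"], strategy="first"); because the task ran with no_log enabled, the playbook output only printed the "censored" placeholder seen above. A minimal sketch of gathering the same facts and reading back one of the entries present in that payload (NetworkManager 1.48.10) could look like the following; the second task is purely illustrative and was not part of this run:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto

    - name: Show the installed NetworkManager package (illustrative)
      ansible.builtin.debug:
        msg: "{{ ansible_facts.packages['NetworkManager'] | default('not installed') }}"

ansible_facts.packages maps each package name to a list of installed instances (name, version, release, epoch, arch, source), which is exactly the structure dumped in the JSON above.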
18662 1726867341.48759: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 18662 1726867341.48804: in run() - task 0affcac9-a3a5-efab-a8ce-00000000005f 18662 1726867341.48823: variable 'ansible_search_path' from source: unknown 18662 1726867341.48830: variable 'ansible_search_path' from source: unknown 18662 1726867341.48876: calling self._execute() 18662 1726867341.48991: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867341.49005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867341.49021: variable 'omit' from source: magic vars 18662 1726867341.49423: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.49441: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867341.49569: variable 'network_state' from source: role '' defaults 18662 1726867341.49590: Evaluated conditional (network_state != {}): False 18662 1726867341.49598: when evaluation is False, skipping this task 18662 1726867341.49606: _execute() done 18662 1726867341.49623: dumping result to json 18662 1726867341.49632: done dumping result, returning 18662 1726867341.49644: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-efab-a8ce-00000000005f] 18662 1726867341.49656: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000005f 18662 1726867341.49871: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000005f 18662 1726867341.49875: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867341.49921: no more pending results, returning what we have 18662 1726867341.49925: results queue empty 18662 1726867341.49926: checking for any_errors_fatal 18662 1726867341.49934: done checking for any_errors_fatal 18662 1726867341.49935: checking for max_fail_percentage 18662 1726867341.49936: done checking for max_fail_percentage 18662 1726867341.49937: checking to see if all hosts have failed and the running result is not ok 18662 1726867341.49937: done checking to see if all hosts have failed 18662 1726867341.49938: getting the remaining hosts for this loop 18662 1726867341.49939: done getting the remaining hosts for this loop 18662 1726867341.49942: getting the next task for host managed_node2 18662 1726867341.49947: done getting next task for host managed_node2 18662 1726867341.49950: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18662 1726867341.49952: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867341.49966: getting variables 18662 1726867341.49968: in VariableManager get_vars() 18662 1726867341.50001: Calling all_inventory to load vars for managed_node2 18662 1726867341.50003: Calling groups_inventory to load vars for managed_node2 18662 1726867341.50005: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867341.50014: Calling all_plugins_play to load vars for managed_node2 18662 1726867341.50017: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867341.50019: Calling groups_plugins_play to load vars for managed_node2 18662 1726867341.51562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867341.53403: done with get_vars() 18662 1726867341.53429: done getting variables 18662 1726867341.53493: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:22:21 -0400 (0:00:00.054) 0:00:36.170 ****** 18662 1726867341.53533: entering _queue_task() for managed_node2/fail 18662 1726867341.53880: worker is 1 (out of 1 available) 18662 1726867341.53893: exiting _queue_task() for managed_node2/fail 18662 1726867341.53906: done queuing things up, now waiting for results queue to drain 18662 1726867341.53907: waiting for pending results... 
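The task queued here is the teaming guard at main.yml:25; the entries that follow show its conditions being evaluated one by one (`ansible_distribution_major_version | int > 9` and `ansible_distribution in __network_rh_distros`, both True on this EL10 host). A sketch of a guard built from those same conditions is shown below; the failure message and the check for team-type connections are assumptions inferred from the task name, not the role's source, and network_connections is assumed to be a defined list of connection dictionaries:

    - name: Abort applying teaming configuration on EL10 or later
      ansible.builtin.fail:
        msg: "Teaming is not supported on this distribution release"  # assumed message
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        # assumed check: does any requested connection use the team type?
        - network_connections | selectattr('type', 'defined')
                               | selectattr('type', 'equalto', 'team')
                               | list | length > 0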
18662 1726867341.54294: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 18662 1726867341.54298: in run() - task 0affcac9-a3a5-efab-a8ce-000000000060 18662 1726867341.54301: variable 'ansible_search_path' from source: unknown 18662 1726867341.54304: variable 'ansible_search_path' from source: unknown 18662 1726867341.54383: calling self._execute() 18662 1726867341.54429: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867341.54440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867341.54453: variable 'omit' from source: magic vars 18662 1726867341.54806: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.54826: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867341.55010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867341.57395: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867341.57558: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867341.57561: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867341.57563: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867341.57594: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867341.57689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.57741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.57785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.57835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.57857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.57963: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.57991: Evaluated conditional (ansible_distribution_major_version | int > 9): True 18662 1726867341.58124: variable 'ansible_distribution' from source: facts 18662 1726867341.58134: variable '__network_rh_distros' from source: role '' defaults 18662 1726867341.58147: Evaluated conditional (ansible_distribution in __network_rh_distros): True 18662 1726867341.58431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.58454: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.58540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.58544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.58557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.58615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.58653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.58686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.58732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.58759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.58808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.58867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.58876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.58925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.58976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.59275: variable 'network_connections' from source: play vars 18662 1726867341.59292: variable 'profile' from source: play vars 18662 1726867341.59366: variable 'profile' from source: play vars 18662 1726867341.59374: variable 'interface' from source: set_fact 18662 1726867341.59481: variable 'interface' from source: set_fact 18662 1726867341.59485: variable 'network_state' from source: role '' defaults 18662 
1726867341.59530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867341.59692: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867341.59738: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867341.59771: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867341.59803: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867341.59852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867341.59954: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867341.59958: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.59961: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867341.59988: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 18662 1726867341.59996: when evaluation is False, skipping this task 18662 1726867341.60003: _execute() done 18662 1726867341.60012: dumping result to json 18662 1726867341.60020: done dumping result, returning 18662 1726867341.60032: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-efab-a8ce-000000000060] 18662 1726867341.60040: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000060 skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 18662 1726867341.60219: no more pending results, returning what we have 18662 1726867341.60224: results queue empty 18662 1726867341.60225: checking for any_errors_fatal 18662 1726867341.60232: done checking for any_errors_fatal 18662 1726867341.60233: checking for max_fail_percentage 18662 1726867341.60235: done checking for max_fail_percentage 18662 1726867341.60236: checking to see if all hosts have failed and the running result is not ok 18662 1726867341.60236: done checking to see if all hosts have failed 18662 1726867341.60237: getting the remaining hosts for this loop 18662 1726867341.60238: done getting the remaining hosts for this loop 18662 1726867341.60242: getting the next task for host managed_node2 18662 1726867341.60249: done getting next task for host managed_node2 18662 1726867341.60253: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18662 1726867341.60255: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867341.60268: getting variables 18662 1726867341.60270: in VariableManager get_vars() 18662 1726867341.60315: Calling all_inventory to load vars for managed_node2 18662 1726867341.60318: Calling groups_inventory to load vars for managed_node2 18662 1726867341.60321: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867341.60332: Calling all_plugins_play to load vars for managed_node2 18662 1726867341.60336: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867341.60339: Calling groups_plugins_play to load vars for managed_node2 18662 1726867341.61096: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000060 18662 1726867341.61100: WORKER PROCESS EXITING 18662 1726867341.62084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867341.63872: done with get_vars() 18662 1726867341.63896: done getting variables 18662 1726867341.63960: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:22:21 -0400 (0:00:00.104) 0:00:36.275 ****** 18662 1726867341.63990: entering _queue_task() for managed_node2/dnf 18662 1726867341.64311: worker is 1 (out of 1 available) 18662 1726867341.64322: exiting _queue_task() for managed_node2/dnf 18662 1726867341.64333: done queuing things up, now waiting for results queue to drain 18662 1726867341.64335: waiting for pending results... 
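The teaming-abort task above walks through four conditions: the distribution guard, the EL10-or-later check (| int > 9), membership in __network_rh_distros, and finally a scan of both network_connections and network_state.interfaces for profiles of type "team". Only the last one is False here, because the play defines a single non-team profile. The following is a hedged reconstruction of that when-chain; the sample data and the failure message are hypothetical, and only the conditions themselves are taken from the trace.

# Sketch only: the when-chain reconstructed from the conditional evaluations logged above.
# Sample data is hypothetical; in the real run these values come from facts, play vars and role defaults.
- hosts: localhost
  gather_facts: false
  vars:
    ansible_distribution: CentOS                       # assumed fact value
    ansible_distribution_major_version: "10"           # assumed fact value
    __network_rh_distros: [RedHat, CentOS, Fedora]     # assumed subset of the role default
    network_connections:
      - name: ethtest0                                 # hypothetical ethernet profile; type "team" would trigger the abort
        type: ethernet
    network_state: {}
  tasks:
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on this system version.   # assumed message
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - >-
          network_connections | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0

Changing the profile's type to "team" in this sketch flips the last condition to True and the task fails instead of skipping, which is the behavior the role is guarding against on EL10+.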
18662 1726867341.64621: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 18662 1726867341.64739: in run() - task 0affcac9-a3a5-efab-a8ce-000000000061 18662 1726867341.64759: variable 'ansible_search_path' from source: unknown 18662 1726867341.64768: variable 'ansible_search_path' from source: unknown 18662 1726867341.64814: calling self._execute() 18662 1726867341.64917: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867341.64936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867341.64953: variable 'omit' from source: magic vars 18662 1726867341.65334: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.65352: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867341.65566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867341.67834: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867341.67917: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867341.67960: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867341.68013: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867341.68047: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867341.68139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.68537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.68568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.68626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.68646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.68768: variable 'ansible_distribution' from source: facts 18662 1726867341.68779: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.68798: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 18662 1726867341.68931: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867341.69067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.69099: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.69140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.69185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.69207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.69282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.69290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.69322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.69373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.69462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.69465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.69467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.69498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.69543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.69562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.69734: variable 'network_connections' from source: play vars 18662 1726867341.69750: variable 'profile' from source: play vars 18662 1726867341.69831: variable 'profile' from source: play vars 18662 1726867341.69882: variable 'interface' from source: set_fact 18662 1726867341.69911: variable 'interface' from source: set_fact 18662 1726867341.69989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 18662 1726867341.70176: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867341.70222: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867341.70264: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867341.70347: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867341.70354: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867341.70383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867341.70457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.70463: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867341.70512: variable '__network_team_connections_defined' from source: role '' defaults 18662 1726867341.70760: variable 'network_connections' from source: play vars 18662 1726867341.70784: variable 'profile' from source: play vars 18662 1726867341.70883: variable 'profile' from source: play vars 18662 1726867341.70888: variable 'interface' from source: set_fact 18662 1726867341.70917: variable 'interface' from source: set_fact 18662 1726867341.70943: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18662 1726867341.70949: when evaluation is False, skipping this task 18662 1726867341.70955: _execute() done 18662 1726867341.70960: dumping result to json 18662 1726867341.70965: done dumping result, returning 18662 1726867341.70974: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-efab-a8ce-000000000061] 18662 1726867341.70984: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000061 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18662 1726867341.71157: no more pending results, returning what we have 18662 1726867341.71161: results queue empty 18662 1726867341.71162: checking for any_errors_fatal 18662 1726867341.71168: done checking for any_errors_fatal 18662 1726867341.71168: checking for max_fail_percentage 18662 1726867341.71170: done checking for max_fail_percentage 18662 1726867341.71171: checking to see if all hosts have failed and the running result is not ok 18662 1726867341.71171: done checking to see if all hosts have failed 18662 1726867341.71172: getting the remaining hosts for this loop 18662 1726867341.71173: done getting the remaining hosts for this loop 18662 1726867341.71178: getting the next task for host managed_node2 18662 1726867341.71185: done getting next task for host managed_node2 18662 
1726867341.71188: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18662 1726867341.71190: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867341.71202: getting variables 18662 1726867341.71204: in VariableManager get_vars() 18662 1726867341.71245: Calling all_inventory to load vars for managed_node2 18662 1726867341.71247: Calling groups_inventory to load vars for managed_node2 18662 1726867341.71249: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867341.71259: Calling all_plugins_play to load vars for managed_node2 18662 1726867341.71262: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867341.71265: Calling groups_plugins_play to load vars for managed_node2 18662 1726867341.71893: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000061 18662 1726867341.71897: WORKER PROCESS EXITING 18662 1726867341.72995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867341.74612: done with get_vars() 18662 1726867341.74639: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 18662 1726867341.74716: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:22:21 -0400 (0:00:00.107) 0:00:36.382 ****** 18662 1726867341.74745: entering _queue_task() for managed_node2/yum 18662 1726867341.75064: worker is 1 (out of 1 available) 18662 1726867341.75079: exiting _queue_task() for managed_node2/yum 18662 1726867341.75091: done queuing things up, now waiting for results queue to drain 18662 1726867341.75092: waiting for pending results... 
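The DNF check above short-circuits on two role-default booleans, __network_wireless_connections_defined and __network_team_connections_defined, both False because the single ethernet profile defines neither a wireless nor a team interface. The sketch below condenses those role-default expressions into plain booleans; only the when-chain comes from the trace, while the dnf body (package name, state, check mode) is an assumption for illustration.

# Sketch only: the guard shape implied by the evaluations above.
- hosts: localhost
  gather_facts: false
  vars:
    ansible_distribution: CentOS                     # assumed fact value
    ansible_distribution_major_version: "10"         # assumed fact value
    __network_wireless_connections_defined: false    # stand-ins for the role-default expressions named in the trace
    __network_team_connections_defined: false
  tasks:
    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: NetworkManager                         # assumed package; the role's real list may differ
        state: latest
      check_mode: true
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined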
18662 1726867341.75272: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 18662 1726867341.75354: in run() - task 0affcac9-a3a5-efab-a8ce-000000000062 18662 1726867341.75365: variable 'ansible_search_path' from source: unknown 18662 1726867341.75370: variable 'ansible_search_path' from source: unknown 18662 1726867341.75400: calling self._execute() 18662 1726867341.75472: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867341.75476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867341.75489: variable 'omit' from source: magic vars 18662 1726867341.75774: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.75787: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867341.75913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867341.77529: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867341.77573: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867341.77602: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867341.77634: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867341.77656: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867341.77715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.77744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.77764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.77792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.77803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.77868: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.77880: Evaluated conditional (ansible_distribution_major_version | int < 8): False 18662 1726867341.77883: when evaluation is False, skipping this task 18662 1726867341.77886: _execute() done 18662 1726867341.77889: dumping result to json 18662 1726867341.77891: done dumping result, returning 18662 1726867341.77898: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-efab-a8ce-000000000062] 18662 
1726867341.77901: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000062 18662 1726867341.77991: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000062 18662 1726867341.77994: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 18662 1726867341.78043: no more pending results, returning what we have 18662 1726867341.78046: results queue empty 18662 1726867341.78047: checking for any_errors_fatal 18662 1726867341.78057: done checking for any_errors_fatal 18662 1726867341.78058: checking for max_fail_percentage 18662 1726867341.78059: done checking for max_fail_percentage 18662 1726867341.78060: checking to see if all hosts have failed and the running result is not ok 18662 1726867341.78061: done checking to see if all hosts have failed 18662 1726867341.78061: getting the remaining hosts for this loop 18662 1726867341.78062: done getting the remaining hosts for this loop 18662 1726867341.78066: getting the next task for host managed_node2 18662 1726867341.78072: done getting next task for host managed_node2 18662 1726867341.78075: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18662 1726867341.78079: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867341.78091: getting variables 18662 1726867341.78093: in VariableManager get_vars() 18662 1726867341.78130: Calling all_inventory to load vars for managed_node2 18662 1726867341.78132: Calling groups_inventory to load vars for managed_node2 18662 1726867341.78134: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867341.78143: Calling all_plugins_play to load vars for managed_node2 18662 1726867341.78146: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867341.78148: Calling groups_plugins_play to load vars for managed_node2 18662 1726867341.79211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867341.80076: done with get_vars() 18662 1726867341.80094: done getting variables 18662 1726867341.80134: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:22:21 -0400 (0:00:00.054) 0:00:36.437 ****** 18662 1726867341.80154: entering _queue_task() for managed_node2/fail 18662 1726867341.80359: worker is 1 (out of 1 available) 18662 1726867341.80370: exiting _queue_task() for managed_node2/fail 18662 1726867341.80383: done queuing things up, now waiting for results queue to drain 18662 1726867341.80385: waiting for pending results... 
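The YUM variant above is the EL7-and-older counterpart of the DNF check: on this host the "| int < 8" guard is False, so the task is always skipped, and ansible-core resolves ansible.builtin.yum to the dnf action as the redirect line in the trace shows. A hedged sketch of that branch, with assumed values and an assumed module body, would look like this:

# Sketch only: the legacy YUM branch; anything below major version 8 would enable it.
- hosts: localhost
  gather_facts: false
  vars:
    ansible_distribution_major_version: "10"   # assumed fact value
  tasks:
    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:                     # resolved to the dnf action by ansible-core, as the redirect above shows
        name: NetworkManager                   # assumed package; the role's real list may differ
        state: latest
      check_mode: true
      when:
        - ansible_distribution_major_version != '6'
        - ansible_distribution_major_version | int < 8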
18662 1726867341.80552: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 18662 1726867341.80627: in run() - task 0affcac9-a3a5-efab-a8ce-000000000063 18662 1726867341.80638: variable 'ansible_search_path' from source: unknown 18662 1726867341.80641: variable 'ansible_search_path' from source: unknown 18662 1726867341.80670: calling self._execute() 18662 1726867341.80746: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867341.80750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867341.80759: variable 'omit' from source: magic vars 18662 1726867341.81030: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.81040: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867341.81121: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867341.81247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867341.82907: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867341.82951: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867341.82979: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867341.83006: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867341.83030: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867341.83095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.83120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.83138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.83164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.83175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.83209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.83230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.83246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.83271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.83283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.83325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.83342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.83358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.83383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.83393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.83499: variable 'network_connections' from source: play vars 18662 1726867341.83508: variable 'profile' from source: play vars 18662 1726867341.83560: variable 'profile' from source: play vars 18662 1726867341.83563: variable 'interface' from source: set_fact 18662 1726867341.83606: variable 'interface' from source: set_fact 18662 1726867341.83667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867341.83784: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867341.83810: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867341.83834: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867341.83854: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867341.83887: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867341.83902: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867341.83922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.83940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867341.83975: 
variable '__network_team_connections_defined' from source: role '' defaults 18662 1726867341.84124: variable 'network_connections' from source: play vars 18662 1726867341.84128: variable 'profile' from source: play vars 18662 1726867341.84169: variable 'profile' from source: play vars 18662 1726867341.84173: variable 'interface' from source: set_fact 18662 1726867341.84219: variable 'interface' from source: set_fact 18662 1726867341.84237: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18662 1726867341.84240: when evaluation is False, skipping this task 18662 1726867341.84243: _execute() done 18662 1726867341.84245: dumping result to json 18662 1726867341.84248: done dumping result, returning 18662 1726867341.84254: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-efab-a8ce-000000000063] 18662 1726867341.84264: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000063 18662 1726867341.84339: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000063 18662 1726867341.84341: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18662 1726867341.84389: no more pending results, returning what we have 18662 1726867341.84392: results queue empty 18662 1726867341.84393: checking for any_errors_fatal 18662 1726867341.84398: done checking for any_errors_fatal 18662 1726867341.84399: checking for max_fail_percentage 18662 1726867341.84401: done checking for max_fail_percentage 18662 1726867341.84402: checking to see if all hosts have failed and the running result is not ok 18662 1726867341.84402: done checking to see if all hosts have failed 18662 1726867341.84403: getting the remaining hosts for this loop 18662 1726867341.84404: done getting the remaining hosts for this loop 18662 1726867341.84408: getting the next task for host managed_node2 18662 1726867341.84414: done getting next task for host managed_node2 18662 1726867341.84417: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 18662 1726867341.84419: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867341.84431: getting variables 18662 1726867341.84432: in VariableManager get_vars() 18662 1726867341.84468: Calling all_inventory to load vars for managed_node2 18662 1726867341.84470: Calling groups_inventory to load vars for managed_node2 18662 1726867341.84472: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867341.84487: Calling all_plugins_play to load vars for managed_node2 18662 1726867341.84490: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867341.84493: Calling groups_plugins_play to load vars for managed_node2 18662 1726867341.85725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867341.87066: done with get_vars() 18662 1726867341.87083: done getting variables 18662 1726867341.87123: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:22:21 -0400 (0:00:00.069) 0:00:36.506 ****** 18662 1726867341.87146: entering _queue_task() for managed_node2/package 18662 1726867341.87355: worker is 1 (out of 1 available) 18662 1726867341.87368: exiting _queue_task() for managed_node2/package 18662 1726867341.87381: done queuing things up, now waiting for results queue to drain 18662 1726867341.87382: waiting for pending results... 18662 1726867341.87547: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 18662 1726867341.87613: in run() - task 0affcac9-a3a5-efab-a8ce-000000000064 18662 1726867341.87626: variable 'ansible_search_path' from source: unknown 18662 1726867341.87630: variable 'ansible_search_path' from source: unknown 18662 1726867341.87657: calling self._execute() 18662 1726867341.87729: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867341.87735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867341.87744: variable 'omit' from source: magic vars 18662 1726867341.88001: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.88010: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867341.88142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867341.88331: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867341.88365: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867341.88391: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867341.88444: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867341.88524: variable 'network_packages' from source: role '' defaults 18662 1726867341.88682: variable '__network_provider_setup' from source: role '' defaults 18662 1726867341.88685: variable '__network_service_name_default_nm' from source: role '' defaults 18662 1726867341.88688: variable 
'__network_service_name_default_nm' from source: role '' defaults 18662 1726867341.88690: variable '__network_packages_default_nm' from source: role '' defaults 18662 1726867341.88702: variable '__network_packages_default_nm' from source: role '' defaults 18662 1726867341.88814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867341.90090: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867341.90136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867341.90162: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867341.90186: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867341.90206: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867341.90265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.90286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.90303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.90333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.90344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.90374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.90392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.90408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.90434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.90449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.90592: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18662 1726867341.90663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.90681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.90697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.90724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.90734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.90795: variable 'ansible_python' from source: facts 18662 1726867341.90816: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18662 1726867341.90872: variable '__network_wpa_supplicant_required' from source: role '' defaults 18662 1726867341.90925: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18662 1726867341.91017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.91034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.91050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.91074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.91088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.91122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867341.91141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867341.91157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.91188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867341.91200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867341.91291: variable 'network_connections' from source: play vars 18662 1726867341.91294: variable 'profile' from source: play vars 18662 1726867341.91367: variable 'profile' from source: play vars 18662 1726867341.91371: variable 'interface' from source: set_fact 18662 1726867341.91424: variable 'interface' from source: set_fact 18662 1726867341.91472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867341.91492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867341.91516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867341.91539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867341.91572: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867341.91752: variable 'network_connections' from source: play vars 18662 1726867341.91756: variable 'profile' from source: play vars 18662 1726867341.91826: variable 'profile' from source: play vars 18662 1726867341.91831: variable 'interface' from source: set_fact 18662 1726867341.91882: variable 'interface' from source: set_fact 18662 1726867341.91905: variable '__network_packages_default_wireless' from source: role '' defaults 18662 1726867341.91962: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867341.92149: variable 'network_connections' from source: play vars 18662 1726867341.92152: variable 'profile' from source: play vars 18662 1726867341.92201: variable 'profile' from source: play vars 18662 1726867341.92205: variable 'interface' from source: set_fact 18662 1726867341.92272: variable 'interface' from source: set_fact 18662 1726867341.92293: variable '__network_packages_default_team' from source: role '' defaults 18662 1726867341.92348: variable '__network_team_connections_defined' from source: role '' defaults 18662 1726867341.92540: variable 'network_connections' from source: play vars 18662 1726867341.92543: variable 'profile' from source: play vars 18662 1726867341.92592: variable 'profile' from source: play vars 18662 1726867341.92596: variable 'interface' from source: set_fact 18662 1726867341.92667: variable 'interface' from source: set_fact 18662 1726867341.92706: variable '__network_service_name_default_initscripts' from source: role '' defaults 18662 1726867341.92751: variable '__network_service_name_default_initscripts' from source: role '' defaults 18662 1726867341.92757: variable '__network_packages_default_initscripts' from source: role '' defaults 18662 1726867341.92800: variable '__network_packages_default_initscripts' from source: role '' defaults 18662 1726867341.92939: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18662 1726867341.93228: variable 'network_connections' from source: play vars 18662 1726867341.93231: variable 'profile' from source: play vars 18662 
1726867341.93276: variable 'profile' from source: play vars 18662 1726867341.93281: variable 'interface' from source: set_fact 18662 1726867341.93327: variable 'interface' from source: set_fact 18662 1726867341.93333: variable 'ansible_distribution' from source: facts 18662 1726867341.93336: variable '__network_rh_distros' from source: role '' defaults 18662 1726867341.93342: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.93352: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18662 1726867341.93460: variable 'ansible_distribution' from source: facts 18662 1726867341.93463: variable '__network_rh_distros' from source: role '' defaults 18662 1726867341.93466: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.93484: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18662 1726867341.93582: variable 'ansible_distribution' from source: facts 18662 1726867341.93587: variable '__network_rh_distros' from source: role '' defaults 18662 1726867341.93589: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.93618: variable 'network_provider' from source: set_fact 18662 1726867341.93629: variable 'ansible_facts' from source: unknown 18662 1726867341.94158: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 18662 1726867341.94162: when evaluation is False, skipping this task 18662 1726867341.94164: _execute() done 18662 1726867341.94166: dumping result to json 18662 1726867341.94168: done dumping result, returning 18662 1726867341.94175: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-efab-a8ce-000000000064] 18662 1726867341.94180: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000064 18662 1726867341.94263: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000064 18662 1726867341.94266: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 18662 1726867341.94317: no more pending results, returning what we have 18662 1726867341.94322: results queue empty 18662 1726867341.94322: checking for any_errors_fatal 18662 1726867341.94329: done checking for any_errors_fatal 18662 1726867341.94330: checking for max_fail_percentage 18662 1726867341.94332: done checking for max_fail_percentage 18662 1726867341.94332: checking to see if all hosts have failed and the running result is not ok 18662 1726867341.94333: done checking to see if all hosts have failed 18662 1726867341.94333: getting the remaining hosts for this loop 18662 1726867341.94335: done getting the remaining hosts for this loop 18662 1726867341.94338: getting the next task for host managed_node2 18662 1726867341.94344: done getting next task for host managed_node2 18662 1726867341.94347: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18662 1726867341.94349: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867341.94361: getting variables 18662 1726867341.94363: in VariableManager get_vars() 18662 1726867341.94404: Calling all_inventory to load vars for managed_node2 18662 1726867341.94407: Calling groups_inventory to load vars for managed_node2 18662 1726867341.94409: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867341.94423: Calling all_plugins_play to load vars for managed_node2 18662 1726867341.94426: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867341.94429: Calling groups_plugins_play to load vars for managed_node2 18662 1726867341.95223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867341.96189: done with get_vars() 18662 1726867341.96204: done getting variables 18662 1726867341.96246: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:22:21 -0400 (0:00:00.091) 0:00:36.598 ****** 18662 1726867341.96267: entering _queue_task() for managed_node2/package 18662 1726867341.96476: worker is 1 (out of 1 available) 18662 1726867341.96491: exiting _queue_task() for managed_node2/package 18662 1726867341.96504: done queuing things up, now waiting for results queue to drain 18662 1726867341.96505: waiting for pending results... 
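The skip recorded just above is worth unpacking: the "Install packages" task only runs when the Jinja2 test (not network_packages is subset(ansible_facts.packages.keys())) is true, that is, when at least one package the role wants is missing from the package facts gathered earlier. Here every required package was already installed, so the conditional evaluated False and the task was skipped. A package task guarded that way looks roughly like the sketch below; it reuses the variable names from the log but is an illustration, not the literal task from fedora.linux_system_roles.network:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # package list resolved from role defaults and the requested provider
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())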
18662 1726867341.96666: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 18662 1726867341.96734: in run() - task 0affcac9-a3a5-efab-a8ce-000000000065 18662 1726867341.96746: variable 'ansible_search_path' from source: unknown 18662 1726867341.96750: variable 'ansible_search_path' from source: unknown 18662 1726867341.96778: calling self._execute() 18662 1726867341.96859: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867341.96863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867341.96873: variable 'omit' from source: magic vars 18662 1726867341.97142: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.97151: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867341.97235: variable 'network_state' from source: role '' defaults 18662 1726867341.97243: Evaluated conditional (network_state != {}): False 18662 1726867341.97246: when evaluation is False, skipping this task 18662 1726867341.97249: _execute() done 18662 1726867341.97252: dumping result to json 18662 1726867341.97255: done dumping result, returning 18662 1726867341.97262: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-efab-a8ce-000000000065] 18662 1726867341.97265: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000065 18662 1726867341.97354: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000065 18662 1726867341.97357: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867341.97422: no more pending results, returning what we have 18662 1726867341.97425: results queue empty 18662 1726867341.97426: checking for any_errors_fatal 18662 1726867341.97430: done checking for any_errors_fatal 18662 1726867341.97431: checking for max_fail_percentage 18662 1726867341.97432: done checking for max_fail_percentage 18662 1726867341.97433: checking to see if all hosts have failed and the running result is not ok 18662 1726867341.97434: done checking to see if all hosts have failed 18662 1726867341.97434: getting the remaining hosts for this loop 18662 1726867341.97435: done getting the remaining hosts for this loop 18662 1726867341.97438: getting the next task for host managed_node2 18662 1726867341.97443: done getting next task for host managed_node2 18662 1726867341.97446: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18662 1726867341.97447: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867341.97459: getting variables 18662 1726867341.97460: in VariableManager get_vars() 18662 1726867341.97491: Calling all_inventory to load vars for managed_node2 18662 1726867341.97493: Calling groups_inventory to load vars for managed_node2 18662 1726867341.97495: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867341.97503: Calling all_plugins_play to load vars for managed_node2 18662 1726867341.97506: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867341.97510: Calling groups_plugins_play to load vars for managed_node2 18662 1726867341.98226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867341.99095: done with get_vars() 18662 1726867341.99112: done getting variables 18662 1726867341.99152: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:22:21 -0400 (0:00:00.029) 0:00:36.627 ****** 18662 1726867341.99171: entering _queue_task() for managed_node2/package 18662 1726867341.99359: worker is 1 (out of 1 available) 18662 1726867341.99373: exiting _queue_task() for managed_node2/package 18662 1726867341.99386: done queuing things up, now waiting for results queue to drain 18662 1726867341.99387: waiting for pending results... 
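Both nmstate-oriented install tasks (the "Install NetworkManager and nmstate when using network_state variable" task skipped above, and the "Install python3-libnmstate when using network_state variable" task just queued) are guarded by (network_state != {}). This play drives the role through network_connections and never sets network_state, so the role default of an empty dict leaves both tasks skipped. For contrast, a play that did exercise that code path might set the variable along the lines of the hedged sketch below; the device name and addressing are purely illustrative and not taken from this run:

    - name: Configure networking via an nmstate-style state (illustrative values)
      hosts: managed_node2
      vars:
        network_state:
          interfaces:
            - name: eth0          # hypothetical device name
              type: ethernet
              state: up
              ipv4:
                enabled: true
                dhcp: true
      roles:
        - fedora.linux_system_roles.network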
18662 1726867341.99542: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 18662 1726867341.99607: in run() - task 0affcac9-a3a5-efab-a8ce-000000000066 18662 1726867341.99619: variable 'ansible_search_path' from source: unknown 18662 1726867341.99622: variable 'ansible_search_path' from source: unknown 18662 1726867341.99647: calling self._execute() 18662 1726867341.99716: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867341.99724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867341.99729: variable 'omit' from source: magic vars 18662 1726867341.99972: variable 'ansible_distribution_major_version' from source: facts 18662 1726867341.99982: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867342.00063: variable 'network_state' from source: role '' defaults 18662 1726867342.00071: Evaluated conditional (network_state != {}): False 18662 1726867342.00074: when evaluation is False, skipping this task 18662 1726867342.00078: _execute() done 18662 1726867342.00081: dumping result to json 18662 1726867342.00083: done dumping result, returning 18662 1726867342.00092: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-efab-a8ce-000000000066] 18662 1726867342.00096: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000066 18662 1726867342.00185: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000066 18662 1726867342.00188: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867342.00234: no more pending results, returning what we have 18662 1726867342.00237: results queue empty 18662 1726867342.00238: checking for any_errors_fatal 18662 1726867342.00244: done checking for any_errors_fatal 18662 1726867342.00245: checking for max_fail_percentage 18662 1726867342.00247: done checking for max_fail_percentage 18662 1726867342.00247: checking to see if all hosts have failed and the running result is not ok 18662 1726867342.00248: done checking to see if all hosts have failed 18662 1726867342.00249: getting the remaining hosts for this loop 18662 1726867342.00250: done getting the remaining hosts for this loop 18662 1726867342.00252: getting the next task for host managed_node2 18662 1726867342.00257: done getting next task for host managed_node2 18662 1726867342.00260: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18662 1726867342.00262: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867342.00273: getting variables 18662 1726867342.00274: in VariableManager get_vars() 18662 1726867342.00313: Calling all_inventory to load vars for managed_node2 18662 1726867342.00315: Calling groups_inventory to load vars for managed_node2 18662 1726867342.00317: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867342.00325: Calling all_plugins_play to load vars for managed_node2 18662 1726867342.00327: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867342.00329: Calling groups_plugins_play to load vars for managed_node2 18662 1726867342.01175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867342.02041: done with get_vars() 18662 1726867342.02056: done getting variables 18662 1726867342.02096: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:22:22 -0400 (0:00:00.029) 0:00:36.656 ****** 18662 1726867342.02118: entering _queue_task() for managed_node2/service 18662 1726867342.02302: worker is 1 (out of 1 available) 18662 1726867342.02317: exiting _queue_task() for managed_node2/service 18662 1726867342.02328: done queuing things up, now waiting for results queue to drain 18662 1726867342.02329: waiting for pending results... 
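The task now being queued, "Restart NetworkManager due to wireless or team interfaces", only acts when the requested profiles include wireless or team connection types; the trace that follows walks network_connections (templated from the play's profile variable and the interface set earlier via set_fact) and finds neither, so the restart is skipped a little further down. For orientation, a plain Ethernet profile of the kind this run appears to use would look something like the sketch below; the keys follow the role's network_connections format, but the values are illustrative rather than quoted from this play:

    network_connections:
      - name: "{{ profile }}"             # profile name supplied as a play var
        interface_name: "{{ interface }}" # device name set earlier via set_fact
        type: ethernet                    # neither wireless nor team, hence no NetworkManager restart
        state: up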
18662 1726867342.02485: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 18662 1726867342.02551: in run() - task 0affcac9-a3a5-efab-a8ce-000000000067 18662 1726867342.02558: variable 'ansible_search_path' from source: unknown 18662 1726867342.02561: variable 'ansible_search_path' from source: unknown 18662 1726867342.02590: calling self._execute() 18662 1726867342.02656: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867342.02661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867342.02669: variable 'omit' from source: magic vars 18662 1726867342.02917: variable 'ansible_distribution_major_version' from source: facts 18662 1726867342.02928: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867342.03004: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867342.03129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867342.04551: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867342.04594: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867342.04622: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867342.04648: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867342.04669: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867342.04780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867342.04784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867342.04786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.04801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867342.04814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867342.04847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867342.04864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867342.04884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 18662 1726867342.04907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867342.04919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867342.04950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867342.04966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867342.04983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.05011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867342.05020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867342.05122: variable 'network_connections' from source: play vars 18662 1726867342.05131: variable 'profile' from source: play vars 18662 1726867342.05180: variable 'profile' from source: play vars 18662 1726867342.05183: variable 'interface' from source: set_fact 18662 1726867342.05227: variable 'interface' from source: set_fact 18662 1726867342.05278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867342.05381: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867342.05408: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867342.05432: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867342.05453: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867342.05484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867342.05500: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867342.05518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.05536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867342.05572: variable '__network_team_connections_defined' from source: role '' defaults 18662 
1726867342.05722: variable 'network_connections' from source: play vars 18662 1726867342.05725: variable 'profile' from source: play vars 18662 1726867342.05769: variable 'profile' from source: play vars 18662 1726867342.05773: variable 'interface' from source: set_fact 18662 1726867342.05816: variable 'interface' from source: set_fact 18662 1726867342.05834: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 18662 1726867342.05837: when evaluation is False, skipping this task 18662 1726867342.05840: _execute() done 18662 1726867342.05842: dumping result to json 18662 1726867342.05844: done dumping result, returning 18662 1726867342.05850: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-efab-a8ce-000000000067] 18662 1726867342.05862: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000067 18662 1726867342.05938: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000067 18662 1726867342.05941: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 18662 1726867342.05985: no more pending results, returning what we have 18662 1726867342.05988: results queue empty 18662 1726867342.05989: checking for any_errors_fatal 18662 1726867342.05996: done checking for any_errors_fatal 18662 1726867342.05997: checking for max_fail_percentage 18662 1726867342.05999: done checking for max_fail_percentage 18662 1726867342.05999: checking to see if all hosts have failed and the running result is not ok 18662 1726867342.06000: done checking to see if all hosts have failed 18662 1726867342.06001: getting the remaining hosts for this loop 18662 1726867342.06002: done getting the remaining hosts for this loop 18662 1726867342.06005: getting the next task for host managed_node2 18662 1726867342.06013: done getting next task for host managed_node2 18662 1726867342.06016: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18662 1726867342.06018: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867342.06029: getting variables 18662 1726867342.06030: in VariableManager get_vars() 18662 1726867342.06060: Calling all_inventory to load vars for managed_node2 18662 1726867342.06062: Calling groups_inventory to load vars for managed_node2 18662 1726867342.06064: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867342.06071: Calling all_plugins_play to load vars for managed_node2 18662 1726867342.06074: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867342.06076: Calling groups_plugins_play to load vars for managed_node2 18662 1726867342.06823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867342.07712: done with get_vars() 18662 1726867342.07728: done getting variables 18662 1726867342.07768: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:22:22 -0400 (0:00:00.056) 0:00:36.713 ****** 18662 1726867342.07790: entering _queue_task() for managed_node2/service 18662 1726867342.08015: worker is 1 (out of 1 available) 18662 1726867342.08027: exiting _queue_task() for managed_node2/service 18662 1726867342.08039: done queuing things up, now waiting for results queue to drain 18662 1726867342.08040: waiting for pending results... 18662 1726867342.08202: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 18662 1726867342.08265: in run() - task 0affcac9-a3a5-efab-a8ce-000000000068 18662 1726867342.08276: variable 'ansible_search_path' from source: unknown 18662 1726867342.08282: variable 'ansible_search_path' from source: unknown 18662 1726867342.08312: calling self._execute() 18662 1726867342.08381: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867342.08385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867342.08396: variable 'omit' from source: magic vars 18662 1726867342.08658: variable 'ansible_distribution_major_version' from source: facts 18662 1726867342.08668: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867342.08778: variable 'network_provider' from source: set_fact 18662 1726867342.08782: variable 'network_state' from source: role '' defaults 18662 1726867342.08792: Evaluated conditional (network_provider == "nm" or network_state != {}): True 18662 1726867342.08797: variable 'omit' from source: magic vars 18662 1726867342.08828: variable 'omit' from source: magic vars 18662 1726867342.08847: variable 'network_service_name' from source: role '' defaults 18662 1726867342.08900: variable 'network_service_name' from source: role '' defaults 18662 1726867342.08970: variable '__network_provider_setup' from source: role '' defaults 18662 1726867342.08973: variable '__network_service_name_default_nm' from source: role '' defaults 18662 1726867342.09019: variable '__network_service_name_default_nm' from source: role '' defaults 18662 1726867342.09027: variable '__network_packages_default_nm' from source: role '' defaults 
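This task finally takes the non-skip path: with network_provider set to "nm", the conditional (network_provider == "nm" or network_state != {}) evaluates True, and the long variable walk that continues below resolves network_service_name and the provider defaults before the service action plugin ships the systemd module to the managed node (the AnsiballZ_systemd.py transfer later in the trace). Stripped of the role's indirection, the step amounts to a service task along the lines of the sketch below, shown only as a generic illustration rather than the role's literal task:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolves to NetworkManager for the nm provider
        state: started
        enabled: true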
18662 1726867342.09074: variable '__network_packages_default_nm' from source: role '' defaults 18662 1726867342.09216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867342.10821: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867342.10862: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867342.10895: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867342.10918: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867342.10940: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867342.10997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867342.11020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867342.11038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.11064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867342.11074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867342.11111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867342.11127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867342.11143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.11167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867342.11179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867342.11320: variable '__network_packages_default_gobject_packages' from source: role '' defaults 18662 1726867342.11393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867342.11413: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867342.11428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.11455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867342.11465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867342.11525: variable 'ansible_python' from source: facts 18662 1726867342.11539: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 18662 1726867342.11596: variable '__network_wpa_supplicant_required' from source: role '' defaults 18662 1726867342.11648: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18662 1726867342.11731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867342.11748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867342.11766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.11794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867342.11804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867342.11837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867342.11856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867342.11872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.11901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867342.11914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867342.11997: variable 'network_connections' from 
source: play vars 18662 1726867342.12003: variable 'profile' from source: play vars 18662 1726867342.12053: variable 'profile' from source: play vars 18662 1726867342.12057: variable 'interface' from source: set_fact 18662 1726867342.12102: variable 'interface' from source: set_fact 18662 1726867342.12168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867342.12282: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867342.12320: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867342.12347: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867342.12375: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867342.12417: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867342.12443: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867342.12466: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.12490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867342.12524: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867342.12701: variable 'network_connections' from source: play vars 18662 1726867342.12705: variable 'profile' from source: play vars 18662 1726867342.12758: variable 'profile' from source: play vars 18662 1726867342.12763: variable 'interface' from source: set_fact 18662 1726867342.12805: variable 'interface' from source: set_fact 18662 1726867342.12830: variable '__network_packages_default_wireless' from source: role '' defaults 18662 1726867342.12885: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867342.13062: variable 'network_connections' from source: play vars 18662 1726867342.13065: variable 'profile' from source: play vars 18662 1726867342.13118: variable 'profile' from source: play vars 18662 1726867342.13121: variable 'interface' from source: set_fact 18662 1726867342.13170: variable 'interface' from source: set_fact 18662 1726867342.13192: variable '__network_packages_default_team' from source: role '' defaults 18662 1726867342.13243: variable '__network_team_connections_defined' from source: role '' defaults 18662 1726867342.13422: variable 'network_connections' from source: play vars 18662 1726867342.13425: variable 'profile' from source: play vars 18662 1726867342.13473: variable 'profile' from source: play vars 18662 1726867342.13476: variable 'interface' from source: set_fact 18662 1726867342.13531: variable 'interface' from source: set_fact 18662 1726867342.13566: variable '__network_service_name_default_initscripts' from source: role '' defaults 18662 1726867342.13613: variable '__network_service_name_default_initscripts' from source: role '' defaults 18662 1726867342.13617: 
variable '__network_packages_default_initscripts' from source: role '' defaults 18662 1726867342.13658: variable '__network_packages_default_initscripts' from source: role '' defaults 18662 1726867342.13791: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 18662 1726867342.14087: variable 'network_connections' from source: play vars 18662 1726867342.14090: variable 'profile' from source: play vars 18662 1726867342.14133: variable 'profile' from source: play vars 18662 1726867342.14137: variable 'interface' from source: set_fact 18662 1726867342.14187: variable 'interface' from source: set_fact 18662 1726867342.14194: variable 'ansible_distribution' from source: facts 18662 1726867342.14197: variable '__network_rh_distros' from source: role '' defaults 18662 1726867342.14202: variable 'ansible_distribution_major_version' from source: facts 18662 1726867342.14213: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 18662 1726867342.14324: variable 'ansible_distribution' from source: facts 18662 1726867342.14328: variable '__network_rh_distros' from source: role '' defaults 18662 1726867342.14331: variable 'ansible_distribution_major_version' from source: facts 18662 1726867342.14342: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 18662 1726867342.14451: variable 'ansible_distribution' from source: facts 18662 1726867342.14456: variable '__network_rh_distros' from source: role '' defaults 18662 1726867342.14459: variable 'ansible_distribution_major_version' from source: facts 18662 1726867342.14487: variable 'network_provider' from source: set_fact 18662 1726867342.14503: variable 'omit' from source: magic vars 18662 1726867342.14522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867342.14541: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867342.14555: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867342.14567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867342.14575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867342.14602: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867342.14606: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867342.14608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867342.14668: Set connection var ansible_timeout to 10 18662 1726867342.14671: Set connection var ansible_connection to ssh 18662 1726867342.14674: Set connection var ansible_shell_executable to /bin/sh 18662 1726867342.14679: Set connection var ansible_shell_type to sh 18662 1726867342.14687: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867342.14692: Set connection var ansible_pipelining to False 18662 1726867342.14712: variable 'ansible_shell_executable' from source: unknown 18662 1726867342.14715: variable 'ansible_connection' from source: unknown 18662 1726867342.14718: variable 'ansible_module_compression' from source: unknown 18662 1726867342.14722: variable 'ansible_shell_type' from source: unknown 18662 1726867342.14724: variable 'ansible_shell_executable' from 
source: unknown 18662 1726867342.14726: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867342.14734: variable 'ansible_pipelining' from source: unknown 18662 1726867342.14737: variable 'ansible_timeout' from source: unknown 18662 1726867342.14739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867342.14799: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867342.14807: variable 'omit' from source: magic vars 18662 1726867342.14814: starting attempt loop 18662 1726867342.14817: running the handler 18662 1726867342.14871: variable 'ansible_facts' from source: unknown 18662 1726867342.15245: _low_level_execute_command(): starting 18662 1726867342.15252: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867342.15747: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867342.15751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867342.15753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18662 1726867342.15755: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867342.15758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867342.15796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867342.15812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867342.15871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867342.17544: stdout chunk (state=3): >>>/root <<< 18662 1726867342.17646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867342.17673: stderr chunk (state=3): >>><<< 18662 1726867342.17678: stdout chunk (state=3): >>><<< 18662 1726867342.17695: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867342.17704: _low_level_execute_command(): starting 18662 1726867342.17709: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395 `" && echo ansible-tmp-1726867342.176945-20424-38778050650395="` echo /root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395 `" ) && sleep 0' 18662 1726867342.18123: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867342.18127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867342.18129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867342.18131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 18662 1726867342.18133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867342.18174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867342.18179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867342.18224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867342.20181: stdout chunk (state=3): >>>ansible-tmp-1726867342.176945-20424-38778050650395=/root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395 <<< 18662 1726867342.20292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867342.20315: stderr chunk (state=3): >>><<< 18662 1726867342.20320: stdout chunk (state=3): >>><<< 18662 1726867342.20331: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867342.176945-20424-38778050650395=/root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867342.20353: variable 'ansible_module_compression' from source: unknown 18662 1726867342.20392: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 18662 1726867342.20443: variable 'ansible_facts' from source: unknown 18662 1726867342.20583: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395/AnsiballZ_systemd.py 18662 1726867342.20671: Sending initial data 18662 1726867342.20674: Sent initial data (154 bytes) 18662 1726867342.21101: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867342.21104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867342.21106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18662 1726867342.21108: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867342.21110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867342.21159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867342.21162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867342.21213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867342.22894: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 18662 1726867342.22897: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867342.22936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867342.22976: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpwmo41h86 /root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395/AnsiballZ_systemd.py <<< 18662 1726867342.22980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395/AnsiballZ_systemd.py" <<< 18662 1726867342.23016: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpwmo41h86" to remote "/root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395/AnsiballZ_systemd.py" <<< 18662 1726867342.23019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395/AnsiballZ_systemd.py" <<< 18662 1726867342.24065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867342.24098: stderr chunk (state=3): >>><<< 18662 1726867342.24102: stdout chunk (state=3): >>><<< 18662 1726867342.24141: done transferring module to remote 18662 1726867342.24148: _low_level_execute_command(): starting 18662 1726867342.24151: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395/ /root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395/AnsiballZ_systemd.py && sleep 0' 18662 1726867342.24545: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867342.24548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867342.24550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867342.24552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 18662 1726867342.24554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867342.24602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867342.24605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867342.24651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867342.26558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867342.26580: stderr chunk (state=3): 
>>><<< 18662 1726867342.26583: stdout chunk (state=3): >>><<< 18662 1726867342.26594: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867342.26597: _low_level_execute_command(): starting 18662 1726867342.26602: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395/AnsiballZ_systemd.py && sleep 0' 18662 1726867342.26982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867342.26986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867342.26998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867342.27055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867342.27060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867342.27106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867342.57010: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": 
"infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4493312", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3308777472", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "898494000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", 
"MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 18662 1726867342.57032: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 18662 1726867342.58976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867342.58983: stdout chunk (state=3): >>><<< 18662 1726867342.58985: stderr chunk (state=3): >>><<< 18662 1726867342.59184: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4493312", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3308777472", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "898494000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867342.59224: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867342.59250: _low_level_execute_command(): starting 18662 1726867342.59260: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867342.176945-20424-38778050650395/ > /dev/null 2>&1 && sleep 0' 18662 1726867342.59918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867342.59933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867342.59947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867342.59972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867342.59990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867342.60001: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867342.60085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867342.60115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867342.60137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867342.60150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867342.60224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867342.62191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867342.62201: stdout chunk (state=3): >>><<< 18662 1726867342.62215: stderr chunk (state=3): >>><<< 18662 1726867342.62232: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867342.62244: handler run complete 18662 1726867342.62320: attempt loop complete, returning result 18662 1726867342.62328: _execute() done 18662 1726867342.62336: dumping result to json 18662 1726867342.62359: done dumping result, returning 18662 1726867342.62381: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-efab-a8ce-000000000068] 18662 1726867342.62384: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000068 18662 1726867342.62905: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000068 18662 1726867342.62911: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867342.62962: no more pending results, returning what we have 18662 1726867342.62965: results queue empty 18662 1726867342.62966: checking for any_errors_fatal 18662 1726867342.62973: done checking for any_errors_fatal 18662 1726867342.62974: checking for max_fail_percentage 18662 1726867342.62976: done checking for max_fail_percentage 18662 1726867342.62978: checking to see if all hosts have failed and the running result is not ok 18662 1726867342.62979: done checking to see if all hosts have failed 18662 1726867342.62980: getting the remaining hosts for this loop 18662 1726867342.62981: done getting the remaining hosts for this loop 18662 1726867342.62985: getting the next task for host managed_node2 18662 1726867342.62991: done getting next task for host 
managed_node2 18662 1726867342.62995: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18662 1726867342.62997: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867342.63007: getting variables 18662 1726867342.63012: in VariableManager get_vars() 18662 1726867342.63146: Calling all_inventory to load vars for managed_node2 18662 1726867342.63149: Calling groups_inventory to load vars for managed_node2 18662 1726867342.63151: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867342.63161: Calling all_plugins_play to load vars for managed_node2 18662 1726867342.63164: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867342.63167: Calling groups_plugins_play to load vars for managed_node2 18662 1726867342.64801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867342.66499: done with get_vars() 18662 1726867342.66523: done getting variables 18662 1726867342.66588: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:22:22 -0400 (0:00:00.588) 0:00:37.301 ****** 18662 1726867342.66623: entering _queue_task() for managed_node2/service 18662 1726867342.66949: worker is 1 (out of 1 available) 18662 1726867342.66960: exiting _queue_task() for managed_node2/service 18662 1726867342.66972: done queuing things up, now waiting for results queue to drain 18662 1726867342.66973: waiting for pending results... 
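The censored result above comes from the role's "Enable and start NetworkManager" task: the 'service' action plugin is loaded and, on this systemd host, dispatches to the systemd module (the AnsiballZ_systemd.py payload transferred and executed earlier in the log), with module_args name=NetworkManager, state=started, enabled=true and no_log in effect, which is why the output is hidden. The role's actual task file (roles/network/tasks/main.yml) is not reproduced in this log, so the following Ansible task is only a minimal sketch reconstructed from the recorded module_args:

  # Sketch reconstructed from the module_args logged above; the real task in the
  # fedora.linux_system_roles.network role is templated on the provider and runs with no_log.
  - name: Enable and start NetworkManager
    ansible.builtin.systemd:
      name: NetworkManager
      state: started
      enabled: true
    no_log: true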
18662 1726867342.67248: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 18662 1726867342.67417: in run() - task 0affcac9-a3a5-efab-a8ce-000000000069 18662 1726867342.67422: variable 'ansible_search_path' from source: unknown 18662 1726867342.67425: variable 'ansible_search_path' from source: unknown 18662 1726867342.67444: calling self._execute() 18662 1726867342.67551: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867342.67563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867342.67576: variable 'omit' from source: magic vars 18662 1726867342.67979: variable 'ansible_distribution_major_version' from source: facts 18662 1726867342.68067: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867342.68132: variable 'network_provider' from source: set_fact 18662 1726867342.68143: Evaluated conditional (network_provider == "nm"): True 18662 1726867342.68248: variable '__network_wpa_supplicant_required' from source: role '' defaults 18662 1726867342.68349: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 18662 1726867342.68537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867342.70734: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867342.70811: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867342.70852: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867342.70899: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867342.70938: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867342.71083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867342.71086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867342.71116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.71168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867342.71282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867342.71286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867342.71288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 18662 1726867342.71308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.71355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867342.71376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867342.71435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867342.71464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867342.71496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.71625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867342.71629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867342.71728: variable 'network_connections' from source: play vars 18662 1726867342.71845: variable 'profile' from source: play vars 18662 1726867342.71849: variable 'profile' from source: play vars 18662 1726867342.71852: variable 'interface' from source: set_fact 18662 1726867342.71911: variable 'interface' from source: set_fact 18662 1726867342.71997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 18662 1726867342.72181: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 18662 1726867342.72225: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 18662 1726867342.72260: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 18662 1726867342.72304: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 18662 1726867342.72352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 18662 1726867342.72383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 18662 1726867342.72482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.72487: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 18662 1726867342.72513: variable '__network_wireless_connections_defined' from source: role '' defaults 18662 1726867342.72784: variable 'network_connections' from source: play vars 18662 1726867342.72795: variable 'profile' from source: play vars 18662 1726867342.72869: variable 'profile' from source: play vars 18662 1726867342.72927: variable 'interface' from source: set_fact 18662 1726867342.72954: variable 'interface' from source: set_fact 18662 1726867342.72989: Evaluated conditional (__network_wpa_supplicant_required): False 18662 1726867342.72997: when evaluation is False, skipping this task 18662 1726867342.73004: _execute() done 18662 1726867342.73023: dumping result to json 18662 1726867342.73034: done dumping result, returning 18662 1726867342.73050: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-efab-a8ce-000000000069] 18662 1726867342.73145: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000069 18662 1726867342.73218: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000069 18662 1726867342.73221: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 18662 1726867342.73296: no more pending results, returning what we have 18662 1726867342.73300: results queue empty 18662 1726867342.73302: checking for any_errors_fatal 18662 1726867342.73322: done checking for any_errors_fatal 18662 1726867342.73323: checking for max_fail_percentage 18662 1726867342.73325: done checking for max_fail_percentage 18662 1726867342.73326: checking to see if all hosts have failed and the running result is not ok 18662 1726867342.73327: done checking to see if all hosts have failed 18662 1726867342.73327: getting the remaining hosts for this loop 18662 1726867342.73329: done getting the remaining hosts for this loop 18662 1726867342.73486: getting the next task for host managed_node2 18662 1726867342.73493: done getting next task for host managed_node2 18662 1726867342.73497: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 18662 1726867342.73499: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867342.73515: getting variables 18662 1726867342.73517: in VariableManager get_vars() 18662 1726867342.73555: Calling all_inventory to load vars for managed_node2 18662 1726867342.73557: Calling groups_inventory to load vars for managed_node2 18662 1726867342.73560: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867342.73571: Calling all_plugins_play to load vars for managed_node2 18662 1726867342.73574: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867342.73582: Calling groups_plugins_play to load vars for managed_node2 18662 1726867342.75136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867342.76954: done with get_vars() 18662 1726867342.76975: done getting variables 18662 1726867342.77038: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:22:22 -0400 (0:00:00.104) 0:00:37.406 ****** 18662 1726867342.77073: entering _queue_task() for managed_node2/service 18662 1726867342.77590: worker is 1 (out of 1 available) 18662 1726867342.77600: exiting _queue_task() for managed_node2/service 18662 1726867342.77612: done queuing things up, now waiting for results queue to drain 18662 1726867342.77614: waiting for pending results... 18662 1726867342.77795: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 18662 1726867342.77819: in run() - task 0affcac9-a3a5-efab-a8ce-00000000006a 18662 1726867342.77893: variable 'ansible_search_path' from source: unknown 18662 1726867342.77896: variable 'ansible_search_path' from source: unknown 18662 1726867342.77899: calling self._execute() 18662 1726867342.77993: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867342.78014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867342.78029: variable 'omit' from source: magic vars 18662 1726867342.78425: variable 'ansible_distribution_major_version' from source: facts 18662 1726867342.78446: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867342.78566: variable 'network_provider' from source: set_fact 18662 1726867342.78600: Evaluated conditional (network_provider == "initscripts"): False 18662 1726867342.78603: when evaluation is False, skipping this task 18662 1726867342.78606: _execute() done 18662 1726867342.78607: dumping result to json 18662 1726867342.78612: done dumping result, returning 18662 1726867342.78614: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-efab-a8ce-00000000006a] 18662 1726867342.78654: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000006a skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 18662 1726867342.78833: no more pending results, returning what we have 18662 1726867342.78837: results queue empty 18662 1726867342.78838: checking for 
any_errors_fatal 18662 1726867342.78849: done checking for any_errors_fatal 18662 1726867342.78850: checking for max_fail_percentage 18662 1726867342.78854: done checking for max_fail_percentage 18662 1726867342.78855: checking to see if all hosts have failed and the running result is not ok 18662 1726867342.78856: done checking to see if all hosts have failed 18662 1726867342.78856: getting the remaining hosts for this loop 18662 1726867342.78858: done getting the remaining hosts for this loop 18662 1726867342.78862: getting the next task for host managed_node2 18662 1726867342.78869: done getting next task for host managed_node2 18662 1726867342.78872: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18662 1726867342.78876: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867342.78893: getting variables 18662 1726867342.78895: in VariableManager get_vars() 18662 1726867342.78938: Calling all_inventory to load vars for managed_node2 18662 1726867342.78941: Calling groups_inventory to load vars for managed_node2 18662 1726867342.78943: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867342.78955: Calling all_plugins_play to load vars for managed_node2 18662 1726867342.78959: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867342.78961: Calling groups_plugins_play to load vars for managed_node2 18662 1726867342.79593: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000006a 18662 1726867342.79597: WORKER PROCESS EXITING 18662 1726867342.80744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867342.83294: done with get_vars() 18662 1726867342.83319: done getting variables 18662 1726867342.83375: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:22:22 -0400 (0:00:00.064) 0:00:37.470 ****** 18662 1726867342.83525: entering _queue_task() for managed_node2/copy 18662 1726867342.84129: worker is 1 (out of 1 available) 18662 1726867342.84255: exiting _queue_task() for managed_node2/copy 18662 1726867342.84266: done queuing things up, now waiting for results queue to drain 18662 1726867342.84268: waiting for pending results... 
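The two skips above ("Enable and start wpa_supplicant" and "Enable network service") follow the same pattern: each task carries a when: condition, and because this run resolved network_provider to "nm", the conditions __network_wpa_supplicant_required and network_provider == "initscripts" both evaluate to False, so the tasks are skipped without touching the host. A hedged sketch of that pattern, reusing the condition strings recorded in the log (the service names and task bodies are assumptions; the real tasks live in roles/network/tasks/main.yml and are not shown here):

  # Assumed task bodies; only the when: expressions are taken from the logged conditionals.
  - name: Enable and start wpa_supplicant
    ansible.builtin.service:
      name: wpa_supplicant   # assumed service name
      state: started
      enabled: true
    when: __network_wpa_supplicant_required

  - name: Enable network service
    ansible.builtin.service:
      name: network          # assumed service name
      enabled: true
    when: network_provider == "initscripts"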
18662 1726867342.84644: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 18662 1726867342.84732: in run() - task 0affcac9-a3a5-efab-a8ce-00000000006b 18662 1726867342.85183: variable 'ansible_search_path' from source: unknown 18662 1726867342.85188: variable 'ansible_search_path' from source: unknown 18662 1726867342.85191: calling self._execute() 18662 1726867342.85194: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867342.85197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867342.85199: variable 'omit' from source: magic vars 18662 1726867342.85742: variable 'ansible_distribution_major_version' from source: facts 18662 1726867342.85969: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867342.86119: variable 'network_provider' from source: set_fact 18662 1726867342.86130: Evaluated conditional (network_provider == "initscripts"): False 18662 1726867342.86137: when evaluation is False, skipping this task 18662 1726867342.86144: _execute() done 18662 1726867342.86156: dumping result to json 18662 1726867342.86162: done dumping result, returning 18662 1726867342.86174: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-efab-a8ce-00000000006b] 18662 1726867342.86187: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000006b skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 18662 1726867342.86341: no more pending results, returning what we have 18662 1726867342.86345: results queue empty 18662 1726867342.86347: checking for any_errors_fatal 18662 1726867342.86353: done checking for any_errors_fatal 18662 1726867342.86354: checking for max_fail_percentage 18662 1726867342.86356: done checking for max_fail_percentage 18662 1726867342.86357: checking to see if all hosts have failed and the running result is not ok 18662 1726867342.86357: done checking to see if all hosts have failed 18662 1726867342.86358: getting the remaining hosts for this loop 18662 1726867342.86360: done getting the remaining hosts for this loop 18662 1726867342.86363: getting the next task for host managed_node2 18662 1726867342.86371: done getting next task for host managed_node2 18662 1726867342.86374: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18662 1726867342.86379: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867342.86392: getting variables 18662 1726867342.86394: in VariableManager get_vars() 18662 1726867342.86546: Calling all_inventory to load vars for managed_node2 18662 1726867342.86549: Calling groups_inventory to load vars for managed_node2 18662 1726867342.86551: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867342.86563: Calling all_plugins_play to load vars for managed_node2 18662 1726867342.86566: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867342.86570: Calling groups_plugins_play to load vars for managed_node2 18662 1726867342.86583: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000006b 18662 1726867342.86587: WORKER PROCESS EXITING 18662 1726867342.88209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867342.89778: done with get_vars() 18662 1726867342.89799: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:22:22 -0400 (0:00:00.063) 0:00:37.534 ****** 18662 1726867342.89874: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 18662 1726867342.90154: worker is 1 (out of 1 available) 18662 1726867342.90165: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 18662 1726867342.90176: done queuing things up, now waiting for results queue to drain 18662 1726867342.90380: waiting for pending results... 18662 1726867342.90434: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 18662 1726867342.90544: in run() - task 0affcac9-a3a5-efab-a8ce-00000000006c 18662 1726867342.90563: variable 'ansible_search_path' from source: unknown 18662 1726867342.90570: variable 'ansible_search_path' from source: unknown 18662 1726867342.90612: calling self._execute() 18662 1726867342.90752: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867342.90763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867342.90780: variable 'omit' from source: magic vars 18662 1726867342.91137: variable 'ansible_distribution_major_version' from source: facts 18662 1726867342.91158: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867342.91169: variable 'omit' from source: magic vars 18662 1726867342.91213: variable 'omit' from source: magic vars 18662 1726867342.91379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 18662 1726867342.93443: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 18662 1726867342.93512: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 18662 1726867342.93557: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 18662 1726867342.93645: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 18662 1726867342.93648: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 18662 1726867342.93716: variable 'network_provider' from source: set_fact 18662 1726867342.93850: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 18662 1726867342.93902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 18662 1726867342.93931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 18662 1726867342.93981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 18662 1726867342.94001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 18662 1726867342.94081: variable 'omit' from source: magic vars 18662 1726867342.94382: variable 'omit' from source: magic vars 18662 1726867342.94385: variable 'network_connections' from source: play vars 18662 1726867342.94388: variable 'profile' from source: play vars 18662 1726867342.94390: variable 'profile' from source: play vars 18662 1726867342.94392: variable 'interface' from source: set_fact 18662 1726867342.94443: variable 'interface' from source: set_fact 18662 1726867342.94597: variable 'omit' from source: magic vars 18662 1726867342.94611: variable '__lsr_ansible_managed' from source: task vars 18662 1726867342.94675: variable '__lsr_ansible_managed' from source: task vars 18662 1726867342.95053: Loaded config def from plugin (lookup/template) 18662 1726867342.95057: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 18662 1726867342.95059: File lookup term: get_ansible_managed.j2 18662 1726867342.95062: variable 'ansible_search_path' from source: unknown 18662 1726867342.95064: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 18662 1726867342.95067: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 18662 1726867342.95070: variable 'ansible_search_path' from source: unknown 18662 1726867343.10899: variable 'ansible_managed' from source: unknown 18662 
1726867343.11022: variable 'omit' from source: magic vars 18662 1726867343.11051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867343.11080: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867343.11104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867343.11122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867343.11133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867343.11152: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867343.11158: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867343.11164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867343.11255: Set connection var ansible_timeout to 10 18662 1726867343.11264: Set connection var ansible_connection to ssh 18662 1726867343.11274: Set connection var ansible_shell_executable to /bin/sh 18662 1726867343.11283: Set connection var ansible_shell_type to sh 18662 1726867343.11297: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867343.11314: Set connection var ansible_pipelining to False 18662 1726867343.11382: variable 'ansible_shell_executable' from source: unknown 18662 1726867343.11385: variable 'ansible_connection' from source: unknown 18662 1726867343.11388: variable 'ansible_module_compression' from source: unknown 18662 1726867343.11390: variable 'ansible_shell_type' from source: unknown 18662 1726867343.11392: variable 'ansible_shell_executable' from source: unknown 18662 1726867343.11394: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867343.11395: variable 'ansible_pipelining' from source: unknown 18662 1726867343.11397: variable 'ansible_timeout' from source: unknown 18662 1726867343.11399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867343.11502: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867343.11524: variable 'omit' from source: magic vars 18662 1726867343.11540: starting attempt loop 18662 1726867343.11641: running the handler 18662 1726867343.11645: _low_level_execute_command(): starting 18662 1726867343.11647: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867343.12179: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867343.12292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867343.12321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867343.12395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867343.14161: stdout chunk (state=3): >>>/root <<< 18662 1726867343.14285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867343.14307: stderr chunk (state=3): >>><<< 18662 1726867343.14332: stdout chunk (state=3): >>><<< 18662 1726867343.14355: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867343.14429: _low_level_execute_command(): starting 18662 1726867343.14433: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683 `" && echo ansible-tmp-1726867343.1436586-20453-87341500229683="` echo /root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683 `" ) && sleep 0' 18662 1726867343.15030: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867343.15082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867343.15100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867343.15148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867343.15165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867343.15192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867343.15272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867343.17321: stdout chunk (state=3): >>>ansible-tmp-1726867343.1436586-20453-87341500229683=/root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683 <<< 18662 1726867343.17454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867343.17516: stderr chunk (state=3): >>><<< 18662 1726867343.17519: stdout chunk (state=3): >>><<< 18662 1726867343.17586: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867343.1436586-20453-87341500229683=/root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867343.17590: variable 'ansible_module_compression' from source: unknown 18662 1726867343.17634: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 18662 1726867343.17685: variable 'ansible_facts' from source: unknown 18662 1726867343.17831: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683/AnsiballZ_network_connections.py 18662 1726867343.17981: Sending initial data 18662 1726867343.18080: Sent initial data (167 bytes) 18662 1726867343.18627: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867343.18641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867343.18654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867343.18672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867343.18784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867343.18801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867343.18828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867343.18847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867343.18948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867343.20636: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18662 1726867343.20649: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 18662 1726867343.20668: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867343.20731: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867343.20764: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpyju0f146 /root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683/AnsiballZ_network_connections.py <<< 18662 1726867343.20788: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683/AnsiballZ_network_connections.py" <<< 18662 1726867343.20852: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpyju0f146" to remote "/root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683/AnsiballZ_network_connections.py" <<< 18662 1726867343.22152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867343.22302: stderr chunk (state=3): >>><<< 18662 1726867343.22305: stdout chunk (state=3): >>><<< 18662 1726867343.22307: done transferring module to remote 18662 1726867343.22309: _low_level_execute_command(): starting 18662 1726867343.22311: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683/ /root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683/AnsiballZ_network_connections.py && sleep 0' 18662 1726867343.22790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867343.22805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867343.22821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867343.22920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867343.22935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867343.22955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867343.22969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867343.23043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867343.24983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867343.24992: stdout chunk (state=3): >>><<< 18662 1726867343.25003: stderr chunk (state=3): >>><<< 18662 1726867343.25022: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867343.25030: _low_level_execute_command(): starting 18662 1726867343.25040: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683/AnsiballZ_network_connections.py && sleep 0' 18662 1726867343.25644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867343.25660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867343.25674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867343.25755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867343.25796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867343.25812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867343.25833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867343.25918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867343.53909: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xx2c3ydl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xx2c3ydl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: 
Connection volatilize aborted on lsr27/9e3fe38f-c3cf-40e1-9296-1c3613d05895: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 18662 1726867343.55902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867343.55926: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. <<< 18662 1726867343.55974: stderr chunk (state=3): >>><<< 18662 1726867343.55998: stdout chunk (state=3): >>><<< 18662 1726867343.56029: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xx2c3ydl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_xx2c3ydl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/9e3fe38f-c3cf-40e1-9296-1c3613d05895: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867343.56071: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867343.56097: _low_level_execute_command(): starting 18662 1726867343.56171: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867343.1436586-20453-87341500229683/ > /dev/null 2>&1 && sleep 0' 18662 1726867343.56725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867343.56775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867343.56792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867343.56851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867343.56886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867343.57037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867343.57203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867343.57317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867343.57336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867343.57537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867343.59511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867343.59552: stdout chunk (state=3): >>><<< 18662 1726867343.59560: stderr chunk (state=3): >>><<< 18662 1726867343.59821: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867343.59833: handler run complete 18662 1726867343.59840: attempt loop complete, returning result 18662 1726867343.59843: _execute() done 18662 1726867343.59845: dumping result to json 18662 1726867343.59847: done dumping result, returning 18662 1726867343.59850: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-efab-a8ce-00000000006c] 18662 1726867343.59852: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000006c 18662 1726867343.59933: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000006c 18662 1726867343.59942: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 18662 1726867343.60037: no more pending results, returning what we have 18662 1726867343.60041: results queue empty 18662 1726867343.60042: checking for any_errors_fatal 18662 1726867343.60049: done checking for any_errors_fatal 18662 1726867343.60050: checking for max_fail_percentage 18662 1726867343.60052: done checking for max_fail_percentage 18662 1726867343.60052: checking to see if all hosts have failed and the running result is not ok 18662 1726867343.60054: done checking to see if all hosts have failed 18662 1726867343.60055: getting the remaining hosts for this loop 18662 1726867343.60056: done getting the remaining hosts for this loop 18662 1726867343.60068: getting the next task for host managed_node2 18662 1726867343.60075: done getting next task for host managed_node2 18662 1726867343.60081: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 18662 1726867343.60083: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867343.60092: getting variables 18662 1726867343.60094: in VariableManager get_vars() 18662 1726867343.60351: Calling all_inventory to load vars for managed_node2 18662 1726867343.60460: Calling groups_inventory to load vars for managed_node2 18662 1726867343.60464: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867343.60527: Calling all_plugins_play to load vars for managed_node2 18662 1726867343.60530: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867343.60534: Calling groups_plugins_play to load vars for managed_node2 18662 1726867343.63050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867343.64836: done with get_vars() 18662 1726867343.64919: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:22:23 -0400 (0:00:00.751) 0:00:38.285 ****** 18662 1726867343.65020: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 18662 1726867343.66292: worker is 1 (out of 1 available) 18662 1726867343.66497: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 18662 1726867343.66507: done queuing things up, now waiting for results queue to drain 18662 1726867343.66508: waiting for pending results... 18662 1726867343.66983: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 18662 1726867343.66988: in run() - task 0affcac9-a3a5-efab-a8ce-00000000006d 18662 1726867343.67070: variable 'ansible_search_path' from source: unknown 18662 1726867343.67180: variable 'ansible_search_path' from source: unknown 18662 1726867343.67183: calling self._execute() 18662 1726867343.67398: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867343.67405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867343.67408: variable 'omit' from source: magic vars 18662 1726867343.67920: variable 'ansible_distribution_major_version' from source: facts 18662 1726867343.67944: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867343.68087: variable 'network_state' from source: role '' defaults 18662 1726867343.68102: Evaluated conditional (network_state != {}): False 18662 1726867343.68155: when evaluation is False, skipping this task 18662 1726867343.68158: _execute() done 18662 1726867343.68160: dumping result to json 18662 1726867343.68163: done dumping result, returning 18662 1726867343.68165: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-efab-a8ce-00000000006d] 18662 1726867343.68167: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000006d skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 18662 1726867343.68313: no more pending results, returning what we have 18662 1726867343.68318: results queue empty 18662 1726867343.68319: checking for any_errors_fatal 18662 1726867343.68330: done checking for any_errors_fatal 18662 1726867343.68331: checking for max_fail_percentage 18662 1726867343.68333: done checking for max_fail_percentage 18662 1726867343.68334: checking to see if all hosts have failed and the running result is 
not ok 18662 1726867343.68335: done checking to see if all hosts have failed 18662 1726867343.68335: getting the remaining hosts for this loop 18662 1726867343.68337: done getting the remaining hosts for this loop 18662 1726867343.68341: getting the next task for host managed_node2 18662 1726867343.68350: done getting next task for host managed_node2 18662 1726867343.68353: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18662 1726867343.68356: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867343.68370: getting variables 18662 1726867343.68372: in VariableManager get_vars() 18662 1726867343.68411: Calling all_inventory to load vars for managed_node2 18662 1726867343.68413: Calling groups_inventory to load vars for managed_node2 18662 1726867343.68416: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867343.68428: Calling all_plugins_play to load vars for managed_node2 18662 1726867343.68431: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867343.68433: Calling groups_plugins_play to load vars for managed_node2 18662 1726867343.69098: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000006d 18662 1726867343.69101: WORKER PROCESS EXITING 18662 1726867343.78799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867343.81163: done with get_vars() 18662 1726867343.81187: done getting variables 18662 1726867343.81243: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:22:23 -0400 (0:00:00.162) 0:00:38.448 ****** 18662 1726867343.81268: entering _queue_task() for managed_node2/debug 18662 1726867343.81797: worker is 1 (out of 1 available) 18662 1726867343.81806: exiting _queue_task() for managed_node2/debug 18662 1726867343.81819: done queuing things up, now waiting for results queue to drain 18662 1726867343.81820: waiting for pending results... 
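Earlier in this block, the "Configure networking state" task was skipped because its conditional evaluated to False: network_state comes from the role defaults and is an empty dict. A rough sketch of that evaluation using Jinja2 directly follows; Ansible wraps this in more machinery, and only the expression and the default value are taken from the log.

from jinja2.nativetypes import NativeEnvironment

# network_state defaults to an empty dict ("from source: role '' defaults"),
# so the when-expression is False and the task is skipped.
env = NativeEnvironment()
network_state = {}
verdict = env.from_string("{{ network_state != {} }}").render(network_state=network_state)
print(verdict)  # False -> "when evaluation is False, skipping this task"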
18662 1726867343.82042: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 18662 1726867343.82293: in run() - task 0affcac9-a3a5-efab-a8ce-00000000006e 18662 1726867343.82297: variable 'ansible_search_path' from source: unknown 18662 1726867343.82299: variable 'ansible_search_path' from source: unknown 18662 1726867343.82301: calling self._execute() 18662 1726867343.82533: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867343.82546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867343.82559: variable 'omit' from source: magic vars 18662 1726867343.83043: variable 'ansible_distribution_major_version' from source: facts 18662 1726867343.83076: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867343.83190: variable 'omit' from source: magic vars 18662 1726867343.83197: variable 'omit' from source: magic vars 18662 1726867343.83331: variable 'omit' from source: magic vars 18662 1726867343.83415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867343.83566: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867343.83570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867343.83640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867343.83694: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867343.83826: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867343.83829: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867343.83832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867343.83866: Set connection var ansible_timeout to 10 18662 1726867343.83874: Set connection var ansible_connection to ssh 18662 1726867343.83887: Set connection var ansible_shell_executable to /bin/sh 18662 1726867343.83896: Set connection var ansible_shell_type to sh 18662 1726867343.83915: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867343.83934: Set connection var ansible_pipelining to False 18662 1726867343.83991: variable 'ansible_shell_executable' from source: unknown 18662 1726867343.84004: variable 'ansible_connection' from source: unknown 18662 1726867343.84015: variable 'ansible_module_compression' from source: unknown 18662 1726867343.84022: variable 'ansible_shell_type' from source: unknown 18662 1726867343.84031: variable 'ansible_shell_executable' from source: unknown 18662 1726867343.84045: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867343.84052: variable 'ansible_pipelining' from source: unknown 18662 1726867343.84060: variable 'ansible_timeout' from source: unknown 18662 1726867343.84068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867343.84287: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 
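In the block above, each "Set connection var" line pairs with a "variable ... from source: ..." line. A source of "unknown" appears to mean no inventory or play variable supplied a value, so the plugin default is used, while ansible_host and ansible_ssh_extra_args come from host vars. A small, hypothetical illustration of that precedence follows; the defaults dict and lookup helper are assumptions, only the variable names and values are taken from the log.

# Hypothetical defaults, mirroring the values the log reports being set.
defaults = {
    "ansible_connection": "ssh",
    "ansible_shell_type": "sh",
    "ansible_shell_executable": "/bin/sh",
    "ansible_timeout": 10,
    "ansible_module_compression": "ZIP_DEFLATED",
    "ansible_pipelining": False,
}
# Host vars for managed_node2 ("from source: host vars"), per this run.
host_vars = {"ansible_host": "10.31.12.116"}

def connection_var(name):
    """Host vars win; otherwise fall back to the plugin default (source 'unknown')."""
    return host_vars.get(name, defaults.get(name))

print(connection_var("ansible_host"))              # 10.31.12.116 (host var)
print(connection_var("ansible_shell_executable"))  # /bin/sh (default)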
1726867343.84366: variable 'omit' from source: magic vars 18662 1726867343.84369: starting attempt loop 18662 1726867343.84371: running the handler 18662 1726867343.84694: variable '__network_connections_result' from source: set_fact 18662 1726867343.84927: handler run complete 18662 1726867343.84931: attempt loop complete, returning result 18662 1726867343.84934: _execute() done 18662 1726867343.84937: dumping result to json 18662 1726867343.85001: done dumping result, returning 18662 1726867343.85004: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-efab-a8ce-00000000006e] 18662 1726867343.85032: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000006e 18662 1726867343.85252: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000006e 18662 1726867343.85255: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 18662 1726867343.85338: no more pending results, returning what we have 18662 1726867343.85341: results queue empty 18662 1726867343.85342: checking for any_errors_fatal 18662 1726867343.85350: done checking for any_errors_fatal 18662 1726867343.85351: checking for max_fail_percentage 18662 1726867343.85353: done checking for max_fail_percentage 18662 1726867343.85353: checking to see if all hosts have failed and the running result is not ok 18662 1726867343.85354: done checking to see if all hosts have failed 18662 1726867343.85355: getting the remaining hosts for this loop 18662 1726867343.85357: done getting the remaining hosts for this loop 18662 1726867343.85360: getting the next task for host managed_node2 18662 1726867343.85592: done getting next task for host managed_node2 18662 1726867343.85596: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18662 1726867343.85598: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867343.85607: getting variables 18662 1726867343.85611: in VariableManager get_vars() 18662 1726867343.85644: Calling all_inventory to load vars for managed_node2 18662 1726867343.85646: Calling groups_inventory to load vars for managed_node2 18662 1726867343.85648: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867343.85658: Calling all_plugins_play to load vars for managed_node2 18662 1726867343.85661: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867343.85664: Calling groups_plugins_play to load vars for managed_node2 18662 1726867343.87622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867343.89271: done with get_vars() 18662 1726867343.89294: done getting variables 18662 1726867343.89358: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:22:23 -0400 (0:00:00.081) 0:00:38.529 ****** 18662 1726867343.89389: entering _queue_task() for managed_node2/debug 18662 1726867343.89907: worker is 1 (out of 1 available) 18662 1726867343.89920: exiting _queue_task() for managed_node2/debug 18662 1726867343.89930: done queuing things up, now waiting for results queue to drain 18662 1726867343.89931: waiting for pending results... 18662 1726867343.90170: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 18662 1726867343.90175: in run() - task 0affcac9-a3a5-efab-a8ce-00000000006f 18662 1726867343.90181: variable 'ansible_search_path' from source: unknown 18662 1726867343.90184: variable 'ansible_search_path' from source: unknown 18662 1726867343.90203: calling self._execute() 18662 1726867343.90315: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867343.90330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867343.90344: variable 'omit' from source: magic vars 18662 1726867343.90815: variable 'ansible_distribution_major_version' from source: facts 18662 1726867343.90820: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867343.90822: variable 'omit' from source: magic vars 18662 1726867343.90825: variable 'omit' from source: magic vars 18662 1726867343.90863: variable 'omit' from source: magic vars 18662 1726867343.90905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867343.90960: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867343.90992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867343.91017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867343.91039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867343.91071: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867343.91085: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867343.91093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867343.91246: Set connection var ansible_timeout to 10 18662 1726867343.91249: Set connection var ansible_connection to ssh 18662 1726867343.91252: Set connection var ansible_shell_executable to /bin/sh 18662 1726867343.91255: Set connection var ansible_shell_type to sh 18662 1726867343.91257: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867343.91259: Set connection var ansible_pipelining to False 18662 1726867343.91272: variable 'ansible_shell_executable' from source: unknown 18662 1726867343.91282: variable 'ansible_connection' from source: unknown 18662 1726867343.91290: variable 'ansible_module_compression' from source: unknown 18662 1726867343.91296: variable 'ansible_shell_type' from source: unknown 18662 1726867343.91303: variable 'ansible_shell_executable' from source: unknown 18662 1726867343.91312: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867343.91355: variable 'ansible_pipelining' from source: unknown 18662 1726867343.91358: variable 'ansible_timeout' from source: unknown 18662 1726867343.91361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867343.91486: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867343.91503: variable 'omit' from source: magic vars 18662 1726867343.91516: starting attempt loop 18662 1726867343.91522: running the handler 18662 1726867343.91579: variable '__network_connections_result' from source: set_fact 18662 1726867343.91684: variable '__network_connections_result' from source: set_fact 18662 1726867343.91770: handler run complete 18662 1726867343.91814: attempt loop complete, returning result 18662 1726867343.91882: _execute() done 18662 1726867343.91885: dumping result to json 18662 1726867343.91887: done dumping result, returning 18662 1726867343.91890: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-efab-a8ce-00000000006f] 18662 1726867343.91897: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000006f 18662 1726867343.91968: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000006f 18662 1726867343.91971: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 18662 1726867343.92084: no more pending results, returning what we have 18662 1726867343.92088: results queue empty 18662 1726867343.92089: checking for any_errors_fatal 18662 1726867343.92096: done checking for any_errors_fatal 18662 1726867343.92097: checking for max_fail_percentage 18662 1726867343.92098: done checking for max_fail_percentage 18662 1726867343.92099: checking to see if 
all hosts have failed and the running result is not ok 18662 1726867343.92100: done checking to see if all hosts have failed 18662 1726867343.92101: getting the remaining hosts for this loop 18662 1726867343.92102: done getting the remaining hosts for this loop 18662 1726867343.92106: getting the next task for host managed_node2 18662 1726867343.92115: done getting next task for host managed_node2 18662 1726867343.92118: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18662 1726867343.92120: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867343.92129: getting variables 18662 1726867343.92131: in VariableManager get_vars() 18662 1726867343.92167: Calling all_inventory to load vars for managed_node2 18662 1726867343.92169: Calling groups_inventory to load vars for managed_node2 18662 1726867343.92171: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867343.92295: Calling all_plugins_play to load vars for managed_node2 18662 1726867343.92300: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867343.92303: Calling groups_plugins_play to load vars for managed_node2 18662 1726867343.93982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867343.95657: done with get_vars() 18662 1726867343.95676: done getting variables 18662 1726867343.95735: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:22:23 -0400 (0:00:00.063) 0:00:38.593 ****** 18662 1726867343.95772: entering _queue_task() for managed_node2/debug 18662 1726867343.96068: worker is 1 (out of 1 available) 18662 1726867343.96282: exiting _queue_task() for managed_node2/debug 18662 1726867343.96292: done queuing things up, now waiting for results queue to drain 18662 1726867343.96294: waiting for pending results... 
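The two debug tasks above print stderr as "\n" and stderr_lines as [""]: the network_connections module wrote only a newline to stderr, and splitting that into lines yields a single empty string. A one-line check of that behaviour is below; the only assumption is that the *_lines field is derived with a splitlines-style split, which this log does not show directly.

# stderr value as reported by the network_connections module result above.
stderr = "\n"
print(stderr.splitlines())  # [''] -> rendered as "stderr_lines": [""]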
18662 1726867343.96424: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 18662 1726867343.96521: in run() - task 0affcac9-a3a5-efab-a8ce-000000000070 18662 1726867343.96525: variable 'ansible_search_path' from source: unknown 18662 1726867343.96528: variable 'ansible_search_path' from source: unknown 18662 1726867343.96550: calling self._execute() 18662 1726867343.96784: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867343.96788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867343.96791: variable 'omit' from source: magic vars 18662 1726867343.97655: variable 'ansible_distribution_major_version' from source: facts 18662 1726867343.97660: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867343.97854: variable 'network_state' from source: role '' defaults 18662 1726867343.97980: Evaluated conditional (network_state != {}): False 18662 1726867343.97983: when evaluation is False, skipping this task 18662 1726867343.97986: _execute() done 18662 1726867343.98072: dumping result to json 18662 1726867343.98075: done dumping result, returning 18662 1726867343.98080: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-efab-a8ce-000000000070] 18662 1726867343.98083: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000070 18662 1726867343.98156: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000070 18662 1726867343.98159: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 18662 1726867343.98208: no more pending results, returning what we have 18662 1726867343.98214: results queue empty 18662 1726867343.98216: checking for any_errors_fatal 18662 1726867343.98225: done checking for any_errors_fatal 18662 1726867343.98226: checking for max_fail_percentage 18662 1726867343.98228: done checking for max_fail_percentage 18662 1726867343.98228: checking to see if all hosts have failed and the running result is not ok 18662 1726867343.98229: done checking to see if all hosts have failed 18662 1726867343.98230: getting the remaining hosts for this loop 18662 1726867343.98231: done getting the remaining hosts for this loop 18662 1726867343.98235: getting the next task for host managed_node2 18662 1726867343.98242: done getting next task for host managed_node2 18662 1726867343.98245: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 18662 1726867343.98248: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867343.98261: getting variables 18662 1726867343.98263: in VariableManager get_vars() 18662 1726867343.98302: Calling all_inventory to load vars for managed_node2 18662 1726867343.98304: Calling groups_inventory to load vars for managed_node2 18662 1726867343.98307: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867343.98321: Calling all_plugins_play to load vars for managed_node2 18662 1726867343.98325: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867343.98328: Calling groups_plugins_play to load vars for managed_node2 18662 1726867344.01663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867344.04992: done with get_vars() 18662 1726867344.05017: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:22:24 -0400 (0:00:00.093) 0:00:38.686 ****** 18662 1726867344.05120: entering _queue_task() for managed_node2/ping 18662 1726867344.05507: worker is 1 (out of 1 available) 18662 1726867344.05521: exiting _queue_task() for managed_node2/ping 18662 1726867344.05532: done queuing things up, now waiting for results queue to drain 18662 1726867344.05533: waiting for pending results... 18662 1726867344.05815: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 18662 1726867344.05913: in run() - task 0affcac9-a3a5-efab-a8ce-000000000071 18662 1726867344.06020: variable 'ansible_search_path' from source: unknown 18662 1726867344.06024: variable 'ansible_search_path' from source: unknown 18662 1726867344.06027: calling self._execute() 18662 1726867344.06076: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867344.06091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867344.06106: variable 'omit' from source: magic vars 18662 1726867344.06504: variable 'ansible_distribution_major_version' from source: facts 18662 1726867344.06525: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867344.06537: variable 'omit' from source: magic vars 18662 1726867344.06586: variable 'omit' from source: magic vars 18662 1726867344.06626: variable 'omit' from source: magic vars 18662 1726867344.06672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867344.06716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867344.06740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867344.06761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867344.06781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867344.06818: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867344.06882: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867344.06892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867344.06942: Set connection var ansible_timeout to 10 18662 1726867344.06949: Set connection var 
ansible_connection to ssh 18662 1726867344.06958: Set connection var ansible_shell_executable to /bin/sh 18662 1726867344.06963: Set connection var ansible_shell_type to sh 18662 1726867344.06973: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867344.06984: Set connection var ansible_pipelining to False 18662 1726867344.07286: variable 'ansible_shell_executable' from source: unknown 18662 1726867344.07290: variable 'ansible_connection' from source: unknown 18662 1726867344.07292: variable 'ansible_module_compression' from source: unknown 18662 1726867344.07294: variable 'ansible_shell_type' from source: unknown 18662 1726867344.07296: variable 'ansible_shell_executable' from source: unknown 18662 1726867344.07298: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867344.07300: variable 'ansible_pipelining' from source: unknown 18662 1726867344.07302: variable 'ansible_timeout' from source: unknown 18662 1726867344.07304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867344.07557: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867344.07574: variable 'omit' from source: magic vars 18662 1726867344.07586: starting attempt loop 18662 1726867344.07593: running the handler 18662 1726867344.07720: _low_level_execute_command(): starting 18662 1726867344.07723: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867344.09160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867344.09195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867344.09211: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867344.09368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867344.09447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867344.09615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867344.09660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867344.11341: stdout chunk (state=3): >>>/root <<< 18662 1726867344.11496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867344.11507: stdout chunk (state=3): >>><<< 18662 1726867344.11524: stderr chunk (state=3): >>><<< 18662 1726867344.11557: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867344.11749: _low_level_execute_command(): starting 18662 1726867344.11753: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051 `" && echo ansible-tmp-1726867344.1164753-20497-55417368770051="` echo /root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051 `" ) && sleep 0' 18662 1726867344.12792: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867344.12806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867344.12830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867344.12859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867344.12994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867344.13158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867344.13184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867344.13319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867344.15243: stdout chunk (state=3): >>>ansible-tmp-1726867344.1164753-20497-55417368770051=/root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051 <<< 18662 1726867344.15401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867344.15404: stdout chunk (state=3): >>><<< 18662 1726867344.15406: stderr chunk (state=3): >>><<< 18662 
1726867344.15432: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867344.1164753-20497-55417368770051=/root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867344.15659: variable 'ansible_module_compression' from source: unknown 18662 1726867344.15670: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 18662 1726867344.15715: variable 'ansible_facts' from source: unknown 18662 1726867344.15983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051/AnsiballZ_ping.py 18662 1726867344.16217: Sending initial data 18662 1726867344.16227: Sent initial data (152 bytes) 18662 1726867344.17493: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867344.17497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867344.17591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867344.17667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867344.17767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867344.17802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867344.19396: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867344.19452: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867344.19471: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp31ooq1ti /root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051/AnsiballZ_ping.py <<< 18662 1726867344.19475: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051/AnsiballZ_ping.py" <<< 18662 1726867344.19516: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp31ooq1ti" to remote "/root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051/AnsiballZ_ping.py" <<< 18662 1726867344.20829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867344.20839: stderr chunk (state=3): >>><<< 18662 1726867344.20848: stdout chunk (state=3): >>><<< 18662 1726867344.21063: done transferring module to remote 18662 1726867344.21066: _low_level_execute_command(): starting 18662 1726867344.21069: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051/ /root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051/AnsiballZ_ping.py && sleep 0' 18662 1726867344.22498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867344.22524: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867344.22543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867344.22564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867344.22734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 
1726867344.24676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867344.24792: stdout chunk (state=3): >>><<< 18662 1726867344.24833: stderr chunk (state=3): >>><<< 18662 1726867344.24913: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867344.25097: _low_level_execute_command(): starting 18662 1726867344.25105: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051/AnsiballZ_ping.py && sleep 0' 18662 1726867344.26725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867344.26729: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867344.26731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867344.26855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867344.27186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867344.27400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867344.42960: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 18662 1726867344.44323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867344.44327: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867344.44419: stderr chunk (state=3): >>><<< 18662 1726867344.44423: stdout chunk (state=3): >>><<< 18662 1726867344.44631: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867344.44635: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867344.44637: _low_level_execute_command(): starting 18662 1726867344.44640: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867344.1164753-20497-55417368770051/ > /dev/null 2>&1 && sleep 0' 18662 1726867344.46060: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867344.46394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 
1726867344.46602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867344.46683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867344.48786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867344.48789: stdout chunk (state=3): >>><<< 18662 1726867344.48791: stderr chunk (state=3): >>><<< 18662 1726867344.48794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867344.48796: handler run complete 18662 1726867344.48798: attempt loop complete, returning result 18662 1726867344.48799: _execute() done 18662 1726867344.48801: dumping result to json 18662 1726867344.48803: done dumping result, returning 18662 1726867344.48805: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-efab-a8ce-000000000071] 18662 1726867344.48807: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000071 18662 1726867344.49186: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000071 18662 1726867344.49189: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 18662 1726867344.49248: no more pending results, returning what we have 18662 1726867344.49256: results queue empty 18662 1726867344.49258: checking for any_errors_fatal 18662 1726867344.49267: done checking for any_errors_fatal 18662 1726867344.49267: checking for max_fail_percentage 18662 1726867344.49269: done checking for max_fail_percentage 18662 1726867344.49270: checking to see if all hosts have failed and the running result is not ok 18662 1726867344.49271: done checking to see if all hosts have failed 18662 1726867344.49271: getting the remaining hosts for this loop 18662 1726867344.49273: done getting the remaining hosts for this loop 18662 1726867344.49279: getting the next task for host managed_node2 18662 1726867344.49289: done getting next task for host managed_node2 18662 1726867344.49291: ^ task is: TASK: meta (role_complete) 18662 1726867344.49293: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 18662 1726867344.49304: getting variables 18662 1726867344.49306: in VariableManager get_vars() 18662 1726867344.49348: Calling all_inventory to load vars for managed_node2 18662 1726867344.49351: Calling groups_inventory to load vars for managed_node2 18662 1726867344.49353: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867344.49364: Calling all_plugins_play to load vars for managed_node2 18662 1726867344.49368: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867344.49370: Calling groups_plugins_play to load vars for managed_node2 18662 1726867344.52455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867344.55947: done with get_vars() 18662 1726867344.55979: done getting variables 18662 1726867344.56117: done queuing things up, now waiting for results queue to drain 18662 1726867344.56120: results queue empty 18662 1726867344.56121: checking for any_errors_fatal 18662 1726867344.56124: done checking for any_errors_fatal 18662 1726867344.56125: checking for max_fail_percentage 18662 1726867344.56126: done checking for max_fail_percentage 18662 1726867344.56127: checking to see if all hosts have failed and the running result is not ok 18662 1726867344.56127: done checking to see if all hosts have failed 18662 1726867344.56128: getting the remaining hosts for this loop 18662 1726867344.56129: done getting the remaining hosts for this loop 18662 1726867344.56132: getting the next task for host managed_node2 18662 1726867344.56250: done getting next task for host managed_node2 18662 1726867344.56253: ^ task is: TASK: meta (flush_handlers) 18662 1726867344.56254: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867344.56257: getting variables 18662 1726867344.56258: in VariableManager get_vars() 18662 1726867344.56271: Calling all_inventory to load vars for managed_node2 18662 1726867344.56273: Calling groups_inventory to load vars for managed_node2 18662 1726867344.56275: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867344.56282: Calling all_plugins_play to load vars for managed_node2 18662 1726867344.56284: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867344.56287: Calling groups_plugins_play to load vars for managed_node2 18662 1726867344.58960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867344.61750: done with get_vars() 18662 1726867344.61794: done getting variables 18662 1726867344.61898: in VariableManager get_vars() 18662 1726867344.61916: Calling all_inventory to load vars for managed_node2 18662 1726867344.61919: Calling groups_inventory to load vars for managed_node2 18662 1726867344.61921: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867344.61926: Calling all_plugins_play to load vars for managed_node2 18662 1726867344.61928: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867344.61931: Calling groups_plugins_play to load vars for managed_node2 18662 1726867344.63925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867344.65764: done with get_vars() 18662 1726867344.65797: done queuing things up, now waiting for results queue to drain 18662 1726867344.65800: results queue empty 18662 1726867344.65801: checking for any_errors_fatal 18662 1726867344.65802: done checking for any_errors_fatal 18662 1726867344.65803: checking for max_fail_percentage 18662 1726867344.65804: done checking for max_fail_percentage 18662 1726867344.65805: checking to see if all hosts have failed and the running result is not ok 18662 1726867344.65805: done checking to see if all hosts have failed 18662 1726867344.65806: getting the remaining hosts for this loop 18662 1726867344.65807: done getting the remaining hosts for this loop 18662 1726867344.65812: getting the next task for host managed_node2 18662 1726867344.65816: done getting next task for host managed_node2 18662 1726867344.65818: ^ task is: TASK: meta (flush_handlers) 18662 1726867344.65819: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867344.65822: getting variables 18662 1726867344.65823: in VariableManager get_vars() 18662 1726867344.65835: Calling all_inventory to load vars for managed_node2 18662 1726867344.65838: Calling groups_inventory to load vars for managed_node2 18662 1726867344.65840: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867344.65845: Calling all_plugins_play to load vars for managed_node2 18662 1726867344.65848: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867344.65850: Calling groups_plugins_play to load vars for managed_node2 18662 1726867344.68361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867344.71852: done with get_vars() 18662 1726867344.71875: done getting variables 18662 1726867344.71928: in VariableManager get_vars() 18662 1726867344.71944: Calling all_inventory to load vars for managed_node2 18662 1726867344.71947: Calling groups_inventory to load vars for managed_node2 18662 1726867344.71951: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867344.71956: Calling all_plugins_play to load vars for managed_node2 18662 1726867344.71958: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867344.71961: Calling groups_plugins_play to load vars for managed_node2 18662 1726867344.74555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867344.77920: done with get_vars() 18662 1726867344.77959: done queuing things up, now waiting for results queue to drain 18662 1726867344.77961: results queue empty 18662 1726867344.77962: checking for any_errors_fatal 18662 1726867344.77963: done checking for any_errors_fatal 18662 1726867344.77964: checking for max_fail_percentage 18662 1726867344.77965: done checking for max_fail_percentage 18662 1726867344.77966: checking to see if all hosts have failed and the running result is not ok 18662 1726867344.77967: done checking to see if all hosts have failed 18662 1726867344.77967: getting the remaining hosts for this loop 18662 1726867344.77969: done getting the remaining hosts for this loop 18662 1726867344.77971: getting the next task for host managed_node2 18662 1726867344.77981: done getting next task for host managed_node2 18662 1726867344.77982: ^ task is: None 18662 1726867344.77984: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867344.77985: done queuing things up, now waiting for results queue to drain 18662 1726867344.77986: results queue empty 18662 1726867344.77986: checking for any_errors_fatal 18662 1726867344.77987: done checking for any_errors_fatal 18662 1726867344.77988: checking for max_fail_percentage 18662 1726867344.77989: done checking for max_fail_percentage 18662 1726867344.77990: checking to see if all hosts have failed and the running result is not ok 18662 1726867344.77990: done checking to see if all hosts have failed 18662 1726867344.77992: getting the next task for host managed_node2 18662 1726867344.77994: done getting next task for host managed_node2 18662 1726867344.77994: ^ task is: None 18662 1726867344.77996: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867344.78165: in VariableManager get_vars() 18662 1726867344.78184: done with get_vars() 18662 1726867344.78190: in VariableManager get_vars() 18662 1726867344.78198: done with get_vars() 18662 1726867344.78202: variable 'omit' from source: magic vars 18662 1726867344.78233: in VariableManager get_vars() 18662 1726867344.78244: done with get_vars() 18662 1726867344.78396: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 18662 1726867344.78868: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18662 1726867344.79003: getting the remaining hosts for this loop 18662 1726867344.79005: done getting the remaining hosts for this loop 18662 1726867344.79007: getting the next task for host managed_node2 18662 1726867344.79010: done getting next task for host managed_node2 18662 1726867344.79012: ^ task is: TASK: Gathering Facts 18662 1726867344.79014: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867344.79015: getting variables 18662 1726867344.79016: in VariableManager get_vars() 18662 1726867344.79025: Calling all_inventory to load vars for managed_node2 18662 1726867344.79027: Calling groups_inventory to load vars for managed_node2 18662 1726867344.79034: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867344.79040: Calling all_plugins_play to load vars for managed_node2 18662 1726867344.79042: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867344.79045: Calling groups_plugins_play to load vars for managed_node2 18662 1726867344.82635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867344.86616: done with get_vars() 18662 1726867344.86643: done getting variables 18662 1726867344.86895: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Friday 20 September 2024 17:22:24 -0400 (0:00:00.818) 0:00:39.504 ****** 18662 1726867344.86924: entering _queue_task() for managed_node2/gather_facts 18662 1726867344.87682: worker is 1 (out of 1 available) 18662 1726867344.87693: exiting _queue_task() for managed_node2/gather_facts 18662 1726867344.87702: done queuing things up, now waiting for results queue to drain 18662 1726867344.87704: waiting for pending results... 
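[editor's note] The ping task above, and the Gathering Facts task that follows, both walk through the same low-level remote execution sequence: discover the remote home directory, create a private temporary directory, transfer the AnsiballZ payload, chmod and run it with the remote Python, read the JSON result from stdout, and remove the temporary directory. The sketch below is a hypothetical, heavily simplified re-creation of that sequence for illustration only; the host string, file names, and the use of scp (the log actually uses the sftp subsystem through the multiplexed ssh connection) are assumptions, and Ansible itself performs these steps inside its ssh connection plugin, not via a script like this.

    #!/usr/bin/env python3
    # Hypothetical sketch of the remote-execution steps visible in the log above.
    # Assumes key-based root login to the managed node and an AnsiballZ_ping.py
    # payload in the current directory; not Ansible's actual implementation.
    import json
    import shlex
    import subprocess
    import time

    HOST = "root@10.31.12.116"  # managed node address taken from the log (assumption: reachable as-is)
    SSH = ["ssh",
           "-o", "ControlMaster=auto",          # reuse one master connection,
           "-o", "ControlPath=~/.ansible/cp/%C",  # as the 'auto-mux' lines show
           "-o", "ControlPersist=60s",
           HOST]

    def run(command: str) -> subprocess.CompletedProcess:
        """Run one /bin/sh command remotely, like _low_level_execute_command()."""
        return subprocess.run(SSH + ["/bin/sh -c " + shlex.quote(command)],
                              capture_output=True, text=True, check=True)

    # 1. 'echo ~ && sleep 0' -- find the remote home directory.
    home = run("echo ~ && sleep 0").stdout.strip()

    # 2. umask 77 && mkdir -p ... -- create a private temporary directory.
    tmpdir = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}"
    run(f"( umask 77 && mkdir -p {shlex.quote(tmpdir)} ) && sleep 0")

    # 3. Transfer the module payload (scp here only to keep the sketch short;
    #    the log shows an sftp 'put' over the same multiplexed connection).
    subprocess.run(["scp", "AnsiballZ_ping.py",
                    f"{HOST}:{tmpdir}/AnsiballZ_ping.py"], check=True)

    # 4. chmod u+x, execute with the remote interpreter, parse the JSON result.
    run(f"chmod u+x {shlex.quote(tmpdir)}/ {shlex.quote(tmpdir)}/AnsiballZ_ping.py && sleep 0")
    out = run(f"/usr/bin/python3.12 {shlex.quote(tmpdir)}/AnsiballZ_ping.py && sleep 0").stdout
    print(json.loads(out))  # expected to contain {"ping": "pong", ...}

    # 5. Remove the temporary directory, as the final rm -f -r step does.
    run(f"rm -f -r {shlex.quote(tmpdir)}/ > /dev/null 2>&1 && sleep 0")

The Gathering Facts task below repeats the same cycle with AnsiballZ_setup.py instead of AnsiballZ_ping.py, which is why its stdout is the large ansible_facts JSON document rather than a one-line pong.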
18662 1726867344.88122: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18662 1726867344.88223: in run() - task 0affcac9-a3a5-efab-a8ce-0000000004e4 18662 1726867344.88228: variable 'ansible_search_path' from source: unknown 18662 1726867344.88360: calling self._execute() 18662 1726867344.88505: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867344.88557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867344.88578: variable 'omit' from source: magic vars 18662 1726867344.89666: variable 'ansible_distribution_major_version' from source: facts 18662 1726867344.89882: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867344.89887: variable 'omit' from source: magic vars 18662 1726867344.89890: variable 'omit' from source: magic vars 18662 1726867344.89892: variable 'omit' from source: magic vars 18662 1726867344.90046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867344.90240: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867344.90250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867344.90281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867344.90326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867344.90464: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867344.90492: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867344.90497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867344.90821: Set connection var ansible_timeout to 10 18662 1726867344.90902: Set connection var ansible_connection to ssh 18662 1726867344.90913: Set connection var ansible_shell_executable to /bin/sh 18662 1726867344.90920: Set connection var ansible_shell_type to sh 18662 1726867344.91001: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867344.91004: Set connection var ansible_pipelining to False 18662 1726867344.91007: variable 'ansible_shell_executable' from source: unknown 18662 1726867344.91114: variable 'ansible_connection' from source: unknown 18662 1726867344.91117: variable 'ansible_module_compression' from source: unknown 18662 1726867344.91121: variable 'ansible_shell_type' from source: unknown 18662 1726867344.91123: variable 'ansible_shell_executable' from source: unknown 18662 1726867344.91125: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867344.91127: variable 'ansible_pipelining' from source: unknown 18662 1726867344.91129: variable 'ansible_timeout' from source: unknown 18662 1726867344.91131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867344.91465: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867344.91484: variable 'omit' from source: magic vars 18662 1726867344.91495: starting attempt loop 18662 1726867344.91502: running the 
handler 18662 1726867344.91528: variable 'ansible_facts' from source: unknown 18662 1726867344.91635: _low_level_execute_command(): starting 18662 1726867344.91638: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867344.92522: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867344.92586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867344.92621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867344.92655: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867344.92735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867344.94461: stdout chunk (state=3): >>>/root <<< 18662 1726867344.94636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867344.94665: stderr chunk (state=3): >>><<< 18662 1726867344.94668: stdout chunk (state=3): >>><<< 18662 1726867344.94791: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867344.94888: _low_level_execute_command(): starting 18662 1726867344.94893: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800 `" && echo ansible-tmp-1726867344.9479804-20533-38620514991800="` echo 
/root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800 `" ) && sleep 0' 18662 1726867344.96797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867344.96913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867344.96956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867344.96976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867344.97043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867344.99190: stdout chunk (state=3): >>>ansible-tmp-1726867344.9479804-20533-38620514991800=/root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800 <<< 18662 1726867344.99297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867344.99352: stderr chunk (state=3): >>><<< 18662 1726867344.99362: stdout chunk (state=3): >>><<< 18662 1726867344.99583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867344.9479804-20533-38620514991800=/root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867344.99587: variable 'ansible_module_compression' from source: unknown 18662 1726867344.99589: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 18662 1726867344.99592: variable 'ansible_facts' from source: 
unknown 18662 1726867345.00034: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800/AnsiballZ_setup.py 18662 1726867345.00782: Sending initial data 18662 1726867345.00786: Sent initial data (153 bytes) 18662 1726867345.01697: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867345.01746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867345.01793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867345.01812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867345.01989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867345.03836: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867345.03927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800/AnsiballZ_setup.py" <<< 18662 1726867345.03938: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmppbgix1s_ /root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800/AnsiballZ_setup.py <<< 18662 1726867345.03972: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmppbgix1s_" to remote "/root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800/AnsiballZ_setup.py" <<< 18662 1726867345.06042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867345.06071: stderr chunk (state=3): >>><<< 18662 1726867345.06081: stdout chunk (state=3): >>><<< 18662 1726867345.06134: done transferring module to remote 18662 1726867345.06155: _low_level_execute_command(): starting 18662 1726867345.06163: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800/ /root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800/AnsiballZ_setup.py && sleep 0' 18662 1726867345.06847: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867345.06948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867345.06968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867345.06989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867345.07075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867345.09030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867345.09065: stdout chunk (state=3): >>><<< 18662 1726867345.09201: stderr chunk (state=3): >>><<< 18662 1726867345.09211: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867345.09297: _low_level_execute_command(): starting 18662 1726867345.09301: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800/AnsiballZ_setup.py && sleep 0' 18662 1726867345.10190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867345.10224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867345.10238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867345.10291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867345.10431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867345.10446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867345.10465: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867345.10602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867345.73051: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 
12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "22", "second": "25", "epoch": "1726867345", "epoch_int": "1726867345", "date": "2024-09-20", "time": "17:22:25", "iso8601_micro": "2024-09-20T21:22:25.383993Z", "iso8601": "2024-09-20T21:22:25Z", "iso8601_basic": "20240920T172225383993", "iso8601_basic_short": "20240920T172225", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.345703125, "5m": 0.3720703125, "15m": 0.2021484375}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": 
false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", 
"receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2949, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 582, "free": 2949}, 
"nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 583, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794881536, "block_size": 4096, "block_total": 65519099, "block_available": 63914766, "block_used": 1604333, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18662 1726867345.75256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867345.75271: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867345.75323: stderr chunk (state=3): >>><<< 18662 1726867345.75335: stdout chunk (state=3): >>><<< 18662 1726867345.75384: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "22", "second": "25", "epoch": "1726867345", "epoch_int": "1726867345", "date": "2024-09-20", "time": "17:22:25", "iso8601_micro": "2024-09-20T21:22:25.383993Z", "iso8601": "2024-09-20T21:22:25Z", "iso8601_basic": "20240920T172225383993", "iso8601_basic_short": "20240920T172225", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.345703125, "5m": 0.3720703125, "15m": 0.2021484375}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", 
"PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2949, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 582, "free": 2949}, "nocache": {"free": 3287, "used": 244}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 583, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794881536, "block_size": 4096, "block_total": 65519099, "block_available": 63914766, "block_used": 1604333, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": 
["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867345.75787: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867345.75803: _low_level_execute_command(): starting 18662 1726867345.75812: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867344.9479804-20533-38620514991800/ > /dev/null 2>&1 && sleep 0' 18662 1726867345.76429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867345.76455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867345.76473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867345.76493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867345.76562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867345.76614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867345.76628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867345.76753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867345.76882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867345.78794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867345.78839: stderr chunk (state=3): >>><<< 18662 1726867345.78860: stdout chunk (state=3): >>><<< 18662 1726867345.78895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867345.78907: handler run complete 18662 1726867345.79016: variable 'ansible_facts' from source: unknown 18662 1726867345.79105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867345.79357: variable 'ansible_facts' from source: unknown 18662 1726867345.79410: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867345.79504: attempt loop complete, returning result 18662 1726867345.79507: _execute() done 18662 1726867345.79509: dumping result to json 18662 1726867345.79529: done dumping result, returning 18662 1726867345.79548: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-efab-a8ce-0000000004e4] 18662 1726867345.79551: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000004e4 18662 1726867345.79990: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000004e4 18662 1726867345.79994: WORKER PROCESS EXITING ok: [managed_node2] 18662 1726867345.80336: no more pending results, returning what we have 18662 1726867345.80340: results queue empty 18662 1726867345.80341: checking for any_errors_fatal 18662 1726867345.80342: done checking for any_errors_fatal 18662 1726867345.80343: checking for max_fail_percentage 18662 1726867345.80344: done checking for max_fail_percentage 18662 1726867345.80345: checking to see if all hosts have failed and the running result is not ok 18662 1726867345.80346: done checking to see if all hosts have failed 18662 1726867345.80347: getting the remaining hosts for 
this loop 18662 1726867345.80348: done getting the remaining hosts for this loop 18662 1726867345.80351: getting the next task for host managed_node2 18662 1726867345.80358: done getting next task for host managed_node2 18662 1726867345.80360: ^ task is: TASK: meta (flush_handlers) 18662 1726867345.80362: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867345.80366: getting variables 18662 1726867345.80367: in VariableManager get_vars() 18662 1726867345.80392: Calling all_inventory to load vars for managed_node2 18662 1726867345.80394: Calling groups_inventory to load vars for managed_node2 18662 1726867345.80397: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867345.80407: Calling all_plugins_play to load vars for managed_node2 18662 1726867345.80411: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867345.80416: Calling groups_plugins_play to load vars for managed_node2 18662 1726867345.81970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867345.83443: done with get_vars() 18662 1726867345.83467: done getting variables 18662 1726867345.83539: in VariableManager get_vars() 18662 1726867345.83545: Calling all_inventory to load vars for managed_node2 18662 1726867345.83547: Calling groups_inventory to load vars for managed_node2 18662 1726867345.83548: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867345.83552: Calling all_plugins_play to load vars for managed_node2 18662 1726867345.83553: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867345.83555: Calling groups_plugins_play to load vars for managed_node2 18662 1726867345.84445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867345.85518: done with get_vars() 18662 1726867345.85543: done queuing things up, now waiting for results queue to drain 18662 1726867345.85546: results queue empty 18662 1726867345.85547: checking for any_errors_fatal 18662 1726867345.85549: done checking for any_errors_fatal 18662 1726867345.85554: checking for max_fail_percentage 18662 1726867345.85555: done checking for max_fail_percentage 18662 1726867345.85556: checking to see if all hosts have failed and the running result is not ok 18662 1726867345.85557: done checking to see if all hosts have failed 18662 1726867345.85557: getting the remaining hosts for this loop 18662 1726867345.85558: done getting the remaining hosts for this loop 18662 1726867345.85561: getting the next task for host managed_node2 18662 1726867345.85564: done getting next task for host managed_node2 18662 1726867345.85567: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 18662 1726867345.85568: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867345.85570: getting variables 18662 1726867345.85571: in VariableManager get_vars() 18662 1726867345.85581: Calling all_inventory to load vars for managed_node2 18662 1726867345.85583: Calling groups_inventory to load vars for managed_node2 18662 1726867345.85585: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867345.85590: Calling all_plugins_play to load vars for managed_node2 18662 1726867345.85592: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867345.85595: Calling groups_plugins_play to load vars for managed_node2 18662 1726867345.86589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867345.87565: done with get_vars() 18662 1726867345.87580: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Friday 20 September 2024 17:22:25 -0400 (0:00:01.007) 0:00:40.511 ****** 18662 1726867345.87635: entering _queue_task() for managed_node2/include_tasks 18662 1726867345.87875: worker is 1 (out of 1 available) 18662 1726867345.87889: exiting _queue_task() for managed_node2/include_tasks 18662 1726867345.87901: done queuing things up, now waiting for results queue to drain 18662 1726867345.87902: waiting for pending results... 18662 1726867345.88115: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_absent.yml' 18662 1726867345.88188: in run() - task 0affcac9-a3a5-efab-a8ce-000000000074 18662 1726867345.88200: variable 'ansible_search_path' from source: unknown 18662 1726867345.88242: calling self._execute() 18662 1726867345.88386: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867345.88394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867345.88397: variable 'omit' from source: magic vars 18662 1726867345.88716: variable 'ansible_distribution_major_version' from source: facts 18662 1726867345.88726: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867345.88734: _execute() done 18662 1726867345.88737: dumping result to json 18662 1726867345.88739: done dumping result, returning 18662 1726867345.88744: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_absent.yml' [0affcac9-a3a5-efab-a8ce-000000000074] 18662 1726867345.88761: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000074 18662 1726867345.88861: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000074 18662 1726867345.88864: WORKER PROCESS EXITING 18662 1726867345.88891: no more pending results, returning what we have 18662 1726867345.88898: in VariableManager get_vars() 18662 1726867345.88934: Calling all_inventory to load vars for managed_node2 18662 1726867345.88936: Calling groups_inventory to load vars for managed_node2 18662 1726867345.88940: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867345.88951: Calling all_plugins_play to load vars for managed_node2 18662 1726867345.88954: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867345.88956: Calling groups_plugins_play to load vars for managed_node2 18662 1726867345.89913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867345.91030: done with get_vars() 18662 
1726867345.91043: variable 'ansible_search_path' from source: unknown 18662 1726867345.91054: we have included files to process 18662 1726867345.91055: generating all_blocks data 18662 1726867345.91056: done generating all_blocks data 18662 1726867345.91056: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 18662 1726867345.91057: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 18662 1726867345.91059: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 18662 1726867345.91183: in VariableManager get_vars() 18662 1726867345.91193: done with get_vars() 18662 1726867345.91270: done processing included file 18662 1726867345.91272: iterating over new_blocks loaded from include file 18662 1726867345.91273: in VariableManager get_vars() 18662 1726867345.91284: done with get_vars() 18662 1726867345.91285: filtering new block on tags 18662 1726867345.91295: done filtering new block on tags 18662 1726867345.91297: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 18662 1726867345.91300: extending task lists for all hosts with included blocks 18662 1726867345.91330: done extending task lists 18662 1726867345.91331: done processing included files 18662 1726867345.91331: results queue empty 18662 1726867345.91332: checking for any_errors_fatal 18662 1726867345.91332: done checking for any_errors_fatal 18662 1726867345.91333: checking for max_fail_percentage 18662 1726867345.91334: done checking for max_fail_percentage 18662 1726867345.91334: checking to see if all hosts have failed and the running result is not ok 18662 1726867345.91335: done checking to see if all hosts have failed 18662 1726867345.91335: getting the remaining hosts for this loop 18662 1726867345.91336: done getting the remaining hosts for this loop 18662 1726867345.91337: getting the next task for host managed_node2 18662 1726867345.91340: done getting next task for host managed_node2 18662 1726867345.91341: ^ task is: TASK: Include the task 'get_profile_stat.yml' 18662 1726867345.91343: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867345.91344: getting variables 18662 1726867345.91345: in VariableManager get_vars() 18662 1726867345.91350: Calling all_inventory to load vars for managed_node2 18662 1726867345.91351: Calling groups_inventory to load vars for managed_node2 18662 1726867345.91353: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867345.91356: Calling all_plugins_play to load vars for managed_node2 18662 1726867345.91357: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867345.91359: Calling groups_plugins_play to load vars for managed_node2 18662 1726867345.92242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867345.93591: done with get_vars() 18662 1726867345.93605: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 17:22:25 -0400 (0:00:00.060) 0:00:40.572 ****** 18662 1726867345.93662: entering _queue_task() for managed_node2/include_tasks 18662 1726867345.94015: worker is 1 (out of 1 available) 18662 1726867345.94029: exiting _queue_task() for managed_node2/include_tasks 18662 1726867345.94043: done queuing things up, now waiting for results queue to drain 18662 1726867345.94044: waiting for pending results... 18662 1726867345.94247: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 18662 1726867345.94313: in run() - task 0affcac9-a3a5-efab-a8ce-0000000004f5 18662 1726867345.94327: variable 'ansible_search_path' from source: unknown 18662 1726867345.94331: variable 'ansible_search_path' from source: unknown 18662 1726867345.94358: calling self._execute() 18662 1726867345.94432: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867345.94439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867345.94448: variable 'omit' from source: magic vars 18662 1726867345.95082: variable 'ansible_distribution_major_version' from source: facts 18662 1726867345.95086: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867345.95386: _execute() done 18662 1726867345.95390: dumping result to json 18662 1726867345.95393: done dumping result, returning 18662 1726867345.95395: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-efab-a8ce-0000000004f5] 18662 1726867345.95398: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000004f5 18662 1726867345.95462: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000004f5 18662 1726867345.95466: WORKER PROCESS EXITING 18662 1726867345.95516: no more pending results, returning what we have 18662 1726867345.95523: in VariableManager get_vars() 18662 1726867345.95555: Calling all_inventory to load vars for managed_node2 18662 1726867345.95558: Calling groups_inventory to load vars for managed_node2 18662 1726867345.95562: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867345.95575: Calling all_plugins_play to load vars for managed_node2 18662 1726867345.95582: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867345.95585: Calling groups_plugins_play to load vars for managed_node2 18662 1726867345.96626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 18662 1726867345.97515: done with get_vars() 18662 1726867345.97527: variable 'ansible_search_path' from source: unknown 18662 1726867345.97528: variable 'ansible_search_path' from source: unknown 18662 1726867345.97553: we have included files to process 18662 1726867345.97554: generating all_blocks data 18662 1726867345.97555: done generating all_blocks data 18662 1726867345.97555: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 18662 1726867345.97556: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 18662 1726867345.97557: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 18662 1726867345.98254: done processing included file 18662 1726867345.98256: iterating over new_blocks loaded from include file 18662 1726867345.98257: in VariableManager get_vars() 18662 1726867345.98265: done with get_vars() 18662 1726867345.98266: filtering new block on tags 18662 1726867345.98281: done filtering new block on tags 18662 1726867345.98282: in VariableManager get_vars() 18662 1726867345.98289: done with get_vars() 18662 1726867345.98290: filtering new block on tags 18662 1726867345.98303: done filtering new block on tags 18662 1726867345.98305: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 18662 1726867345.98310: extending task lists for all hosts with included blocks 18662 1726867345.98370: done extending task lists 18662 1726867345.98371: done processing included files 18662 1726867345.98371: results queue empty 18662 1726867345.98372: checking for any_errors_fatal 18662 1726867345.98373: done checking for any_errors_fatal 18662 1726867345.98374: checking for max_fail_percentage 18662 1726867345.98374: done checking for max_fail_percentage 18662 1726867345.98375: checking to see if all hosts have failed and the running result is not ok 18662 1726867345.98375: done checking to see if all hosts have failed 18662 1726867345.98376: getting the remaining hosts for this loop 18662 1726867345.98376: done getting the remaining hosts for this loop 18662 1726867345.98379: getting the next task for host managed_node2 18662 1726867345.98382: done getting next task for host managed_node2 18662 1726867345.98384: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 18662 1726867345.98386: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867345.98387: getting variables 18662 1726867345.98388: in VariableManager get_vars() 18662 1726867345.98427: Calling all_inventory to load vars for managed_node2 18662 1726867345.98430: Calling groups_inventory to load vars for managed_node2 18662 1726867345.98432: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867345.98437: Calling all_plugins_play to load vars for managed_node2 18662 1726867345.98439: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867345.98446: Calling groups_plugins_play to load vars for managed_node2 18662 1726867345.99505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867346.01226: done with get_vars() 18662 1726867346.01247: done getting variables 18662 1726867346.01302: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:22:26 -0400 (0:00:00.076) 0:00:40.648 ****** 18662 1726867346.01331: entering _queue_task() for managed_node2/set_fact 18662 1726867346.01751: worker is 1 (out of 1 available) 18662 1726867346.01763: exiting _queue_task() for managed_node2/set_fact 18662 1726867346.01775: done queuing things up, now waiting for results queue to drain 18662 1726867346.01779: waiting for pending results... 
18662 1726867346.02196: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 18662 1726867346.02201: in run() - task 0affcac9-a3a5-efab-a8ce-000000000502 18662 1726867346.02204: variable 'ansible_search_path' from source: unknown 18662 1726867346.02207: variable 'ansible_search_path' from source: unknown 18662 1726867346.02239: calling self._execute() 18662 1726867346.02345: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867346.02358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867346.02373: variable 'omit' from source: magic vars 18662 1726867346.02755: variable 'ansible_distribution_major_version' from source: facts 18662 1726867346.02773: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867346.02786: variable 'omit' from source: magic vars 18662 1726867346.02839: variable 'omit' from source: magic vars 18662 1726867346.02885: variable 'omit' from source: magic vars 18662 1726867346.02932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867346.02973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867346.03000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867346.03023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867346.03038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867346.03068: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867346.03076: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867346.03282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867346.03285: Set connection var ansible_timeout to 10 18662 1726867346.03288: Set connection var ansible_connection to ssh 18662 1726867346.03290: Set connection var ansible_shell_executable to /bin/sh 18662 1726867346.03292: Set connection var ansible_shell_type to sh 18662 1726867346.03294: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867346.03296: Set connection var ansible_pipelining to False 18662 1726867346.03298: variable 'ansible_shell_executable' from source: unknown 18662 1726867346.03299: variable 'ansible_connection' from source: unknown 18662 1726867346.03302: variable 'ansible_module_compression' from source: unknown 18662 1726867346.03303: variable 'ansible_shell_type' from source: unknown 18662 1726867346.03306: variable 'ansible_shell_executable' from source: unknown 18662 1726867346.03307: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867346.03312: variable 'ansible_pipelining' from source: unknown 18662 1726867346.03315: variable 'ansible_timeout' from source: unknown 18662 1726867346.03321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867346.03426: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867346.03446: variable 
'omit' from source: magic vars 18662 1726867346.03456: starting attempt loop 18662 1726867346.03463: running the handler 18662 1726867346.03480: handler run complete 18662 1726867346.03494: attempt loop complete, returning result 18662 1726867346.03501: _execute() done 18662 1726867346.03512: dumping result to json 18662 1726867346.03520: done dumping result, returning 18662 1726867346.03532: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-efab-a8ce-000000000502] 18662 1726867346.03546: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000502 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 18662 1726867346.03726: no more pending results, returning what we have 18662 1726867346.03730: results queue empty 18662 1726867346.03731: checking for any_errors_fatal 18662 1726867346.03733: done checking for any_errors_fatal 18662 1726867346.03734: checking for max_fail_percentage 18662 1726867346.03735: done checking for max_fail_percentage 18662 1726867346.03736: checking to see if all hosts have failed and the running result is not ok 18662 1726867346.03737: done checking to see if all hosts have failed 18662 1726867346.03738: getting the remaining hosts for this loop 18662 1726867346.03739: done getting the remaining hosts for this loop 18662 1726867346.03743: getting the next task for host managed_node2 18662 1726867346.03751: done getting next task for host managed_node2 18662 1726867346.03754: ^ task is: TASK: Stat profile file 18662 1726867346.03758: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867346.03762: getting variables 18662 1726867346.03764: in VariableManager get_vars() 18662 1726867346.03795: Calling all_inventory to load vars for managed_node2 18662 1726867346.03798: Calling groups_inventory to load vars for managed_node2 18662 1726867346.03801: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867346.03815: Calling all_plugins_play to load vars for managed_node2 18662 1726867346.03819: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867346.03822: Calling groups_plugins_play to load vars for managed_node2 18662 1726867346.04590: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000502 18662 1726867346.04593: WORKER PROCESS EXITING 18662 1726867346.05650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867346.07229: done with get_vars() 18662 1726867346.07247: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:22:26 -0400 (0:00:00.059) 0:00:40.708 ****** 18662 1726867346.07334: entering _queue_task() for managed_node2/stat 18662 1726867346.07599: worker is 1 (out of 1 available) 18662 1726867346.07614: exiting _queue_task() for managed_node2/stat 18662 1726867346.07625: done queuing things up, now waiting for results queue to drain 18662 1726867346.07626: waiting for pending results... 18662 1726867346.07899: running TaskExecutor() for managed_node2/TASK: Stat profile file 18662 1726867346.08016: in run() - task 0affcac9-a3a5-efab-a8ce-000000000503 18662 1726867346.08035: variable 'ansible_search_path' from source: unknown 18662 1726867346.08041: variable 'ansible_search_path' from source: unknown 18662 1726867346.08075: calling self._execute() 18662 1726867346.08181: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867346.08193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867346.08212: variable 'omit' from source: magic vars 18662 1726867346.08604: variable 'ansible_distribution_major_version' from source: facts 18662 1726867346.08629: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867346.08646: variable 'omit' from source: magic vars 18662 1726867346.08704: variable 'omit' from source: magic vars 18662 1726867346.08802: variable 'profile' from source: include params 18662 1726867346.08816: variable 'interface' from source: set_fact 18662 1726867346.08896: variable 'interface' from source: set_fact 18662 1726867346.08924: variable 'omit' from source: magic vars 18662 1726867346.08971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867346.09014: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867346.09040: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867346.09063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867346.09086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867346.09124: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 18662 1726867346.09134: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867346.09142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867346.09252: Set connection var ansible_timeout to 10 18662 1726867346.09273: Set connection var ansible_connection to ssh 18662 1726867346.09292: Set connection var ansible_shell_executable to /bin/sh 18662 1726867346.09383: Set connection var ansible_shell_type to sh 18662 1726867346.09387: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867346.09390: Set connection var ansible_pipelining to False 18662 1726867346.09392: variable 'ansible_shell_executable' from source: unknown 18662 1726867346.09396: variable 'ansible_connection' from source: unknown 18662 1726867346.09398: variable 'ansible_module_compression' from source: unknown 18662 1726867346.09400: variable 'ansible_shell_type' from source: unknown 18662 1726867346.09402: variable 'ansible_shell_executable' from source: unknown 18662 1726867346.09404: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867346.09406: variable 'ansible_pipelining' from source: unknown 18662 1726867346.09408: variable 'ansible_timeout' from source: unknown 18662 1726867346.09413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867346.09883: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867346.09887: variable 'omit' from source: magic vars 18662 1726867346.09890: starting attempt loop 18662 1726867346.09893: running the handler 18662 1726867346.09895: _low_level_execute_command(): starting 18662 1726867346.09897: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867346.11151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867346.11168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867346.11211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867346.11358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867346.11434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867346.11515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867346.13256: stdout chunk (state=3): >>>/root <<< 18662 1726867346.13381: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 18662 1726867346.13387: stdout chunk (state=3): >>><<< 18662 1726867346.13396: stderr chunk (state=3): >>><<< 18662 1726867346.13488: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867346.13491: _low_level_execute_command(): starting 18662 1726867346.13494: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229 `" && echo ansible-tmp-1726867346.1346474-20587-205223257410229="` echo /root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229 `" ) && sleep 0' 18662 1726867346.14101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867346.14109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867346.14124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867346.14176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867346.14244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867346.14287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867346.14419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867346.16396: stdout chunk (state=3): >>>ansible-tmp-1726867346.1346474-20587-205223257410229=/root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229 <<< 
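[Editor's note] The "Set connection var ..." entries above (ansible_timeout 10, ansible_connection ssh, ansible_shell_executable /bin/sh, ansible_pipelining False, etc.) are the per-host connection settings Ansible resolves before running each task. A minimal sketch of how such values could be pinned per host in a YAML inventory follows; this is a hypothetical file for illustration only, not the contents of /tmp/network-5rw/inventory.yml used in this run.

# Hypothetical inventory sketch, not the inventory used in this run.
all:
  hosts:
    managed_node2:
      ansible_host: 10.31.12.116        # address seen in the ssh debug output above
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
      ansible_timeout: 10
      ansible_pipelining: false         # matches "Set connection var ansible_pipelining to False"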
18662 1726867346.16582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867346.16585: stdout chunk (state=3): >>><<< 18662 1726867346.16587: stderr chunk (state=3): >>><<< 18662 1726867346.16698: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867346.1346474-20587-205223257410229=/root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867346.16742: variable 'ansible_module_compression' from source: unknown 18662 1726867346.16882: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18662 1726867346.16890: variable 'ansible_facts' from source: unknown 18662 1726867346.17159: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229/AnsiballZ_stat.py 18662 1726867346.18098: Sending initial data 18662 1726867346.18101: Sent initial data (153 bytes) 18662 1726867346.19053: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867346.19057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867346.19059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867346.19061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867346.19063: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867346.19065: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867346.19067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867346.19069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18662 1726867346.19071: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 18662 1726867346.19486: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867346.19595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867346.19825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867346.21446: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18662 1726867346.21455: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 18662 1726867346.21462: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 18662 1726867346.21470: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 18662 1726867346.21485: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867346.21607: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867346.21642: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp62ocf8im /root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229/AnsiballZ_stat.py <<< 18662 1726867346.21646: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229/AnsiballZ_stat.py" <<< 18662 1726867346.21702: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp62ocf8im" to remote "/root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229/AnsiballZ_stat.py" <<< 18662 1726867346.23163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867346.23167: stdout chunk (state=3): >>><<< 18662 1726867346.23174: stderr chunk (state=3): >>><<< 18662 1726867346.23282: done transferring module to remote 18662 1726867346.23285: _low_level_execute_command(): starting 18662 1726867346.23288: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229/ /root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229/AnsiballZ_stat.py && sleep 0' 18662 1726867346.24594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867346.24719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867346.24723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867346.24725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867346.24727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867346.24730: stderr chunk (state=3): >>>debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867346.24974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867346.25000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867346.26930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867346.26934: stderr chunk (state=3): >>><<< 18662 1726867346.26943: stdout chunk (state=3): >>><<< 18662 1726867346.26965: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867346.26968: _low_level_execute_command(): starting 18662 1726867346.26973: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229/AnsiballZ_stat.py && sleep 0' 18662 1726867346.28085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867346.28094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867346.28105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867346.28121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867346.28134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867346.28140: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867346.28154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867346.28164: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18662 
1726867346.28587: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867346.28590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867346.28592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867346.44503: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18662 1726867346.45943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867346.45947: stdout chunk (state=3): >>><<< 18662 1726867346.45982: stderr chunk (state=3): >>><<< 18662 1726867346.45987: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
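[Editor's note] The stat result above echoes the full module_args (path /etc/sysconfig/network-scripts/ifcfg-lsr27, get_attributes/get_checksum/get_mime all false), and the later conditional "profile_stat.stat.exists" shows where the result is registered. A plausible reconstruction of the "Stat profile file" task at get_profile_stat.yml:9 is sketched below; it is inferred from this log, not copied from the actual task file.

# Hedged reconstruction of the "Stat profile file" task, inferred from the
# module_args echoed in the result above; the templated path is an assumption
# based on the "profile"/"interface" variables (lsr27) seen earlier.
- name: Stat profile file
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat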
18662 1726867346.46071: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867346.46079: _low_level_execute_command(): starting 18662 1726867346.46092: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867346.1346474-20587-205223257410229/ > /dev/null 2>&1 && sleep 0' 18662 1726867346.47182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867346.47409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867346.47413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867346.47415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867346.47416: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867346.47669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867346.47672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867346.49570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867346.49574: stderr chunk (state=3): >>><<< 18662 1726867346.49578: stdout chunk (state=3): >>><<< 18662 1726867346.49597: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867346.49605: handler run complete 18662 1726867346.49630: attempt loop complete, returning result 18662 1726867346.49633: _execute() done 18662 1726867346.49636: dumping result to json 18662 1726867346.49638: done dumping result, returning 18662 1726867346.49647: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affcac9-a3a5-efab-a8ce-000000000503] 18662 1726867346.49652: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000503 18662 1726867346.49755: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000503 18662 1726867346.49758: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 18662 1726867346.49814: no more pending results, returning what we have 18662 1726867346.49818: results queue empty 18662 1726867346.49819: checking for any_errors_fatal 18662 1726867346.49828: done checking for any_errors_fatal 18662 1726867346.49829: checking for max_fail_percentage 18662 1726867346.49831: done checking for max_fail_percentage 18662 1726867346.49831: checking to see if all hosts have failed and the running result is not ok 18662 1726867346.49832: done checking to see if all hosts have failed 18662 1726867346.49833: getting the remaining hosts for this loop 18662 1726867346.49834: done getting the remaining hosts for this loop 18662 1726867346.49837: getting the next task for host managed_node2 18662 1726867346.49845: done getting next task for host managed_node2 18662 1726867346.49847: ^ task is: TASK: Set NM profile exist flag based on the profile files 18662 1726867346.49851: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867346.49855: getting variables 18662 1726867346.49856: in VariableManager get_vars() 18662 1726867346.49888: Calling all_inventory to load vars for managed_node2 18662 1726867346.49891: Calling groups_inventory to load vars for managed_node2 18662 1726867346.49894: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867346.49904: Calling all_plugins_play to load vars for managed_node2 18662 1726867346.49907: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867346.49909: Calling groups_plugins_play to load vars for managed_node2 18662 1726867346.53012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867346.67399: done with get_vars() 18662 1726867346.67447: done getting variables 18662 1726867346.67693: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:22:26 -0400 (0:00:00.603) 0:00:41.312 ****** 18662 1726867346.67725: entering _queue_task() for managed_node2/set_fact 18662 1726867346.68307: worker is 1 (out of 1 available) 18662 1726867346.68322: exiting _queue_task() for managed_node2/set_fact 18662 1726867346.68334: done queuing things up, now waiting for results queue to drain 18662 1726867346.68336: waiting for pending results... 
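[Editor's note] The task just queued (get_profile_stat.yml:17) only fires when the ifcfg file found by the stat task exists; in the execution that follows, "profile_stat.stat.exists" evaluates False and the task is skipped. A hedged sketch of such a task is shown below; the fact name is a placeholder, since the log shows only the task name and its conditions, and the ansible_distribution_major_version check is applied to every task in this run (likely inherited from an enclosing block or include rather than written on the task itself).

# Hedged sketch only; "profile_exists" is a placeholder fact name.
- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    profile_exists: true              # placeholder fact name
  when: profile_stat.stat.exists      # evaluated False below, so the task is skipped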
18662 1726867346.68996: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 18662 1726867346.69244: in run() - task 0affcac9-a3a5-efab-a8ce-000000000504 18662 1726867346.69248: variable 'ansible_search_path' from source: unknown 18662 1726867346.69251: variable 'ansible_search_path' from source: unknown 18662 1726867346.69255: calling self._execute() 18662 1726867346.69495: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867346.69499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867346.69573: variable 'omit' from source: magic vars 18662 1726867346.70389: variable 'ansible_distribution_major_version' from source: facts 18662 1726867346.70587: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867346.70684: variable 'profile_stat' from source: set_fact 18662 1726867346.70760: Evaluated conditional (profile_stat.stat.exists): False 18662 1726867346.70809: when evaluation is False, skipping this task 18662 1726867346.70816: _execute() done 18662 1726867346.70824: dumping result to json 18662 1726867346.70830: done dumping result, returning 18662 1726867346.70839: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-efab-a8ce-000000000504] 18662 1726867346.70961: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000504 18662 1726867346.71098: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000504 18662 1726867346.71102: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18662 1726867346.71152: no more pending results, returning what we have 18662 1726867346.71155: results queue empty 18662 1726867346.71156: checking for any_errors_fatal 18662 1726867346.71165: done checking for any_errors_fatal 18662 1726867346.71166: checking for max_fail_percentage 18662 1726867346.71168: done checking for max_fail_percentage 18662 1726867346.71169: checking to see if all hosts have failed and the running result is not ok 18662 1726867346.71170: done checking to see if all hosts have failed 18662 1726867346.71170: getting the remaining hosts for this loop 18662 1726867346.71171: done getting the remaining hosts for this loop 18662 1726867346.71175: getting the next task for host managed_node2 18662 1726867346.71184: done getting next task for host managed_node2 18662 1726867346.71187: ^ task is: TASK: Get NM profile info 18662 1726867346.71192: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867346.71196: getting variables 18662 1726867346.71198: in VariableManager get_vars() 18662 1726867346.71233: Calling all_inventory to load vars for managed_node2 18662 1726867346.71235: Calling groups_inventory to load vars for managed_node2 18662 1726867346.71239: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867346.71251: Calling all_plugins_play to load vars for managed_node2 18662 1726867346.71255: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867346.71257: Calling groups_plugins_play to load vars for managed_node2 18662 1726867346.73916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867346.77333: done with get_vars() 18662 1726867346.77364: done getting variables 18662 1726867346.77631: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:22:26 -0400 (0:00:00.099) 0:00:41.412 ****** 18662 1726867346.77666: entering _queue_task() for managed_node2/shell 18662 1726867346.77668: Creating lock for shell 18662 1726867346.78070: worker is 1 (out of 1 available) 18662 1726867346.78086: exiting _queue_task() for managed_node2/shell 18662 1726867346.78099: done queuing things up, now waiting for results queue to drain 18662 1726867346.78100: waiting for pending results... 
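[Editor's note] The "Get NM profile info" task queued here (get_profile_stat.yml:25) runs through the shell action plugin; the command it executes is echoed further below as "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", and the "...ignoring" marker on its failure indicates the error is tolerated. A hedged reconstruction follows; the register name is a placeholder and the templated grep pattern is an assumption.

# Hedged reconstruction of "Get NM profile info"; register name is a placeholder.
- name: Get NM profile info
  ansible.builtin.shell: >-
    nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc
  register: nm_profile_info           # placeholder register name
  ignore_errors: true                 # grep exits 1 when no matching profile is found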
18662 1726867346.78494: running TaskExecutor() for managed_node2/TASK: Get NM profile info 18662 1726867346.78499: in run() - task 0affcac9-a3a5-efab-a8ce-000000000505 18662 1726867346.78511: variable 'ansible_search_path' from source: unknown 18662 1726867346.78518: variable 'ansible_search_path' from source: unknown 18662 1726867346.78558: calling self._execute() 18662 1726867346.78663: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867346.78679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867346.78696: variable 'omit' from source: magic vars 18662 1726867346.79097: variable 'ansible_distribution_major_version' from source: facts 18662 1726867346.79117: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867346.79130: variable 'omit' from source: magic vars 18662 1726867346.79185: variable 'omit' from source: magic vars 18662 1726867346.79319: variable 'profile' from source: include params 18662 1726867346.79330: variable 'interface' from source: set_fact 18662 1726867346.79483: variable 'interface' from source: set_fact 18662 1726867346.79487: variable 'omit' from source: magic vars 18662 1726867346.79531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867346.79570: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867346.79983: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867346.79988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867346.79992: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867346.79995: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867346.79997: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867346.80000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867346.80002: Set connection var ansible_timeout to 10 18662 1726867346.80005: Set connection var ansible_connection to ssh 18662 1726867346.80007: Set connection var ansible_shell_executable to /bin/sh 18662 1726867346.80092: Set connection var ansible_shell_type to sh 18662 1726867346.80111: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867346.80125: Set connection var ansible_pipelining to False 18662 1726867346.80156: variable 'ansible_shell_executable' from source: unknown 18662 1726867346.80164: variable 'ansible_connection' from source: unknown 18662 1726867346.80382: variable 'ansible_module_compression' from source: unknown 18662 1726867346.80385: variable 'ansible_shell_type' from source: unknown 18662 1726867346.80387: variable 'ansible_shell_executable' from source: unknown 18662 1726867346.80389: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867346.80391: variable 'ansible_pipelining' from source: unknown 18662 1726867346.80394: variable 'ansible_timeout' from source: unknown 18662 1726867346.80396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867346.80463: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867346.80782: variable 'omit' from source: magic vars 18662 1726867346.80786: starting attempt loop 18662 1726867346.80788: running the handler 18662 1726867346.80790: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867346.80792: _low_level_execute_command(): starting 18662 1726867346.80794: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867346.81626: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867346.81658: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867346.81729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867346.83456: stdout chunk (state=3): >>>/root <<< 18662 1726867346.83600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867346.83604: stdout chunk (state=3): >>><<< 18662 1726867346.83606: stderr chunk (state=3): >>><<< 18662 1726867346.83629: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867346.83649: _low_level_execute_command(): starting 18662 1726867346.83678: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683 `" && echo ansible-tmp-1726867346.836354-20626-211594289343683="` echo /root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683 `" ) && sleep 0' 18662 1726867346.84305: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867346.84325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867346.84353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867346.84465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867346.84492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867346.84514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867346.84540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867346.84611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867346.86834: stdout chunk (state=3): >>>ansible-tmp-1726867346.836354-20626-211594289343683=/root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683 <<< 18662 1726867346.87085: stdout chunk (state=3): >>><<< 18662 1726867346.87089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867346.87091: stderr chunk (state=3): >>><<< 18662 1726867346.87094: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867346.836354-20626-211594289343683=/root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867346.87097: variable 'ansible_module_compression' from source: unknown 18662 1726867346.87099: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18662 1726867346.87102: variable 'ansible_facts' from source: unknown 18662 1726867346.87237: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683/AnsiballZ_command.py 18662 1726867346.87672: Sending initial data 18662 1726867346.87676: Sent initial data (155 bytes) 18662 1726867346.88447: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867346.88490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867346.88508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867346.88533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867346.88630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867346.90296: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867346.90352: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867346.90443: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpkyd6_40d /root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683/AnsiballZ_command.py <<< 18662 1726867346.90447: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683/AnsiballZ_command.py" <<< 18662 1726867346.90481: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpkyd6_40d" to remote "/root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683/AnsiballZ_command.py" <<< 18662 1726867346.91207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867346.91211: stdout chunk (state=3): >>><<< 18662 1726867346.91382: stderr chunk (state=3): >>><<< 18662 1726867346.91386: done transferring module to remote 18662 1726867346.91389: _low_level_execute_command(): starting 18662 1726867346.91391: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683/ /root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683/AnsiballZ_command.py && sleep 0' 18662 1726867346.92096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867346.92148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867346.92210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867346.92231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867346.92243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867346.92512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867346.94421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867346.94424: stdout chunk (state=3): >>><<< 18662 1726867346.94434: stderr chunk (state=3): >>><<< 18662 1726867346.94451: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867346.94455: _low_level_execute_command(): starting 18662 1726867346.94584: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683/AnsiballZ_command.py && sleep 0' 18662 1726867346.95148: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867346.95158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867346.95197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867346.95204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18662 1726867346.95261: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867346.95309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867346.95330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867346.95378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867346.95425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867347.13007: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 17:22:27.112130", "end": "2024-09-20 17:22:27.128851", "delta": "0:00:00.016721", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18662 1726867347.14649: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867347.14671: stderr chunk (state=3): >>><<< 18662 1726867347.14675: stdout chunk (state=3): >>><<< 18662 1726867347.14698: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-20 17:22:27.112130", "end": "2024-09-20 17:22:27.128851", "delta": "0:00:00.016721", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.12.116 closed. 
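[Editor's note] The rc=1 in the result above is simply grep reporting "no lines matched" (the lsr27 profile has no file under /etc), and the run tolerates it, as the "...ignoring" marker below shows. As a general alternative pattern, not what this role does, the same tolerance can be expressed with failed_when so that only genuine errors (rc greater than 1, for example nmcli missing) fail the task:

# General pattern, not taken from the role: accept grep's "no match" rc=1.
- name: Get NM profile info
  ansible.builtin.shell: >-
    nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc
  register: nm_profile_info
  failed_when: nm_profile_info.rc not in [0, 1]
  changed_when: false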
18662 1726867347.14758: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867347.14790: _low_level_execute_command(): starting 18662 1726867347.14794: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867346.836354-20626-211594289343683/ > /dev/null 2>&1 && sleep 0' 18662 1726867347.15388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867347.15395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867347.15453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867347.15459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867347.15461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867347.15505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867347.17413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867347.17442: stderr chunk (state=3): >>><<< 18662 1726867347.17445: stdout chunk (state=3): >>><<< 18662 1726867347.17461: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867347.17467: handler run complete 18662 1726867347.17486: Evaluated conditional (False): False 18662 1726867347.17495: attempt loop complete, returning result 18662 1726867347.17498: _execute() done 18662 1726867347.17500: dumping result to json 18662 1726867347.17502: done dumping result, returning 18662 1726867347.17513: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affcac9-a3a5-efab-a8ce-000000000505] 18662 1726867347.17516: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000505 18662 1726867347.17616: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000505 18662 1726867347.17619: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "delta": "0:00:00.016721", "end": "2024-09-20 17:22:27.128851", "rc": 1, "start": "2024-09-20 17:22:27.112130" } MSG: non-zero return code ...ignoring 18662 1726867347.17700: no more pending results, returning what we have 18662 1726867347.17703: results queue empty 18662 1726867347.17704: checking for any_errors_fatal 18662 1726867347.17712: done checking for any_errors_fatal 18662 1726867347.17713: checking for max_fail_percentage 18662 1726867347.17714: done checking for max_fail_percentage 18662 1726867347.17715: checking to see if all hosts have failed and the running result is not ok 18662 1726867347.17716: done checking to see if all hosts have failed 18662 1726867347.17716: getting the remaining hosts for this loop 18662 1726867347.17718: done getting the remaining hosts for this loop 18662 1726867347.17721: getting the next task for host managed_node2 18662 1726867347.17728: done getting next task for host managed_node2 18662 1726867347.17731: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 18662 1726867347.17735: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867347.17739: getting variables 18662 1726867347.17741: in VariableManager get_vars() 18662 1726867347.17770: Calling all_inventory to load vars for managed_node2 18662 1726867347.17773: Calling groups_inventory to load vars for managed_node2 18662 1726867347.17776: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867347.17825: Calling all_plugins_play to load vars for managed_node2 18662 1726867347.17829: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867347.17833: Calling groups_plugins_play to load vars for managed_node2 18662 1726867347.18957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867347.20023: done with get_vars() 18662 1726867347.20037: done getting variables 18662 1726867347.20100: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:22:27 -0400 (0:00:00.424) 0:00:41.836 ****** 18662 1726867347.20123: entering _queue_task() for managed_node2/set_fact 18662 1726867347.20387: worker is 1 (out of 1 available) 18662 1726867347.20398: exiting _queue_task() for managed_node2/set_fact 18662 1726867347.20410: done queuing things up, now waiting for results queue to drain 18662 1726867347.20411: waiting for pending results... 
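[editor's note] The ignored nmcli failure traced above and the set_fact task queued right after it suggest a task pair roughly like the hedged sketch below. Only the command string, the nm_profile_exists / lsr_net_profile_exists names, the when: condition and the task names are taken from the log; the literal lsr27 is presumably "{{ profile }}" in the source file, ignore_errors is inferred from the "...ignoring" after rc=1, and which facts are actually set is an assumption.

- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc
  register: nm_profile_exists
  ignore_errors: true   # matches the "...ignoring" printed after the non-zero return code

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true   # fact name taken from the later assert; other facts set here are unknown
  when: nm_profile_exists.rc == 0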
18662 1726867347.20609: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 18662 1726867347.20730: in run() - task 0affcac9-a3a5-efab-a8ce-000000000506 18662 1726867347.20749: variable 'ansible_search_path' from source: unknown 18662 1726867347.20796: variable 'ansible_search_path' from source: unknown 18662 1726867347.20810: calling self._execute() 18662 1726867347.20860: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.20887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867347.20897: variable 'omit' from source: magic vars 18662 1726867347.21234: variable 'ansible_distribution_major_version' from source: facts 18662 1726867347.21238: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867347.21330: variable 'nm_profile_exists' from source: set_fact 18662 1726867347.21341: Evaluated conditional (nm_profile_exists.rc == 0): False 18662 1726867347.21344: when evaluation is False, skipping this task 18662 1726867347.21349: _execute() done 18662 1726867347.21352: dumping result to json 18662 1726867347.21356: done dumping result, returning 18662 1726867347.21359: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-efab-a8ce-000000000506] 18662 1726867347.21362: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000506 18662 1726867347.21446: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000506 18662 1726867347.21449: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 18662 1726867347.21518: no more pending results, returning what we have 18662 1726867347.21522: results queue empty 18662 1726867347.21523: checking for any_errors_fatal 18662 1726867347.21529: done checking for any_errors_fatal 18662 1726867347.21530: checking for max_fail_percentage 18662 1726867347.21531: done checking for max_fail_percentage 18662 1726867347.21532: checking to see if all hosts have failed and the running result is not ok 18662 1726867347.21533: done checking to see if all hosts have failed 18662 1726867347.21533: getting the remaining hosts for this loop 18662 1726867347.21535: done getting the remaining hosts for this loop 18662 1726867347.21538: getting the next task for host managed_node2 18662 1726867347.21545: done getting next task for host managed_node2 18662 1726867347.21547: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 18662 1726867347.21551: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 18662 1726867347.21554: getting variables 18662 1726867347.21555: in VariableManager get_vars() 18662 1726867347.21580: Calling all_inventory to load vars for managed_node2 18662 1726867347.21582: Calling groups_inventory to load vars for managed_node2 18662 1726867347.21585: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867347.21594: Calling all_plugins_play to load vars for managed_node2 18662 1726867347.21596: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867347.21599: Calling groups_plugins_play to load vars for managed_node2 18662 1726867347.22364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867347.23272: done with get_vars() 18662 1726867347.23289: done getting variables 18662 1726867347.23329: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867347.23414: variable 'profile' from source: include params 18662 1726867347.23417: variable 'interface' from source: set_fact 18662 1726867347.23458: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-lsr27] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:22:27 -0400 (0:00:00.033) 0:00:41.870 ****** 18662 1726867347.23482: entering _queue_task() for managed_node2/command 18662 1726867347.23664: worker is 1 (out of 1 available) 18662 1726867347.23675: exiting _queue_task() for managed_node2/command 18662 1726867347.23688: done queuing things up, now waiting for results queue to drain 18662 1726867347.23689: waiting for pending results... 
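[editor's note] The skip above (false_condition: nm_profile_exists.rc == 0) and the profile_stat.stat.exists guard reported for the following ifcfg tasks imply a stat-then-check pattern along these lines. Only the profile_stat variable and the when: expression appear in the log; the stat path, the task names marked hypothetical, and the grep command are illustrative assumptions.

- name: Get stat of the ifcfg file   # hypothetical task name
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # assumed location
  register: profile_stat

- name: "Get the ansible_managed comment in ifcfg-{{ profile }}"
  command: "grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # assumed command
  when: profile_stat.stat.exists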
18662 1726867347.23855: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-lsr27 18662 1726867347.23936: in run() - task 0affcac9-a3a5-efab-a8ce-000000000508 18662 1726867347.23947: variable 'ansible_search_path' from source: unknown 18662 1726867347.23951: variable 'ansible_search_path' from source: unknown 18662 1726867347.23979: calling self._execute() 18662 1726867347.24044: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.24048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867347.24058: variable 'omit' from source: magic vars 18662 1726867347.24334: variable 'ansible_distribution_major_version' from source: facts 18662 1726867347.24343: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867347.24431: variable 'profile_stat' from source: set_fact 18662 1726867347.24441: Evaluated conditional (profile_stat.stat.exists): False 18662 1726867347.24445: when evaluation is False, skipping this task 18662 1726867347.24448: _execute() done 18662 1726867347.24451: dumping result to json 18662 1726867347.24456: done dumping result, returning 18662 1726867347.24459: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-lsr27 [0affcac9-a3a5-efab-a8ce-000000000508] 18662 1726867347.24463: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000508 18662 1726867347.24545: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000508 18662 1726867347.24548: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18662 1726867347.24625: no more pending results, returning what we have 18662 1726867347.24628: results queue empty 18662 1726867347.24628: checking for any_errors_fatal 18662 1726867347.24633: done checking for any_errors_fatal 18662 1726867347.24634: checking for max_fail_percentage 18662 1726867347.24636: done checking for max_fail_percentage 18662 1726867347.24636: checking to see if all hosts have failed and the running result is not ok 18662 1726867347.24637: done checking to see if all hosts have failed 18662 1726867347.24638: getting the remaining hosts for this loop 18662 1726867347.24639: done getting the remaining hosts for this loop 18662 1726867347.24641: getting the next task for host managed_node2 18662 1726867347.24647: done getting next task for host managed_node2 18662 1726867347.24649: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 18662 1726867347.24652: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867347.24655: getting variables 18662 1726867347.24656: in VariableManager get_vars() 18662 1726867347.24680: Calling all_inventory to load vars for managed_node2 18662 1726867347.24683: Calling groups_inventory to load vars for managed_node2 18662 1726867347.24685: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867347.24695: Calling all_plugins_play to load vars for managed_node2 18662 1726867347.24697: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867347.24698: Calling groups_plugins_play to load vars for managed_node2 18662 1726867347.25655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867347.26852: done with get_vars() 18662 1726867347.26866: done getting variables 18662 1726867347.26907: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867347.27013: variable 'profile' from source: include params 18662 1726867347.27017: variable 'interface' from source: set_fact 18662 1726867347.27072: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-lsr27] *********************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:22:27 -0400 (0:00:00.036) 0:00:41.906 ****** 18662 1726867347.27120: entering _queue_task() for managed_node2/set_fact 18662 1726867347.27352: worker is 1 (out of 1 available) 18662 1726867347.27363: exiting _queue_task() for managed_node2/set_fact 18662 1726867347.27373: done queuing things up, now waiting for results queue to drain 18662 1726867347.27375: waiting for pending results... 
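[editor's note] The variable traces above ('profile' from include params, 'interface' from set_fact) together with the already-rendered headings ("... in ifcfg-lsr27") suggest that get_profile_stat.yml is included with profile mapped to the interface name. The sketch below shows that plumbing; the include location, the task names, and the exact values are assumptions beyond what the log records.

- name: Set the interface name   # hypothetical; the log only shows interface coming from set_fact
  set_fact:
    interface: lsr27

- name: Run the profile checks   # hypothetical task name
  include_tasks: tasks/get_profile_stat.yml
  vars:
    profile: "{{ interface }}"   # makes "ifcfg-{{ profile }}" render as "ifcfg-lsr27" in the task names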
18662 1726867347.27568: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-lsr27 18662 1726867347.27671: in run() - task 0affcac9-a3a5-efab-a8ce-000000000509 18662 1726867347.27685: variable 'ansible_search_path' from source: unknown 18662 1726867347.27688: variable 'ansible_search_path' from source: unknown 18662 1726867347.27723: calling self._execute() 18662 1726867347.27781: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.27786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867347.27794: variable 'omit' from source: magic vars 18662 1726867347.28080: variable 'ansible_distribution_major_version' from source: facts 18662 1726867347.28088: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867347.28183: variable 'profile_stat' from source: set_fact 18662 1726867347.28199: Evaluated conditional (profile_stat.stat.exists): False 18662 1726867347.28206: when evaluation is False, skipping this task 18662 1726867347.28210: _execute() done 18662 1726867347.28213: dumping result to json 18662 1726867347.28215: done dumping result, returning 18662 1726867347.28218: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-lsr27 [0affcac9-a3a5-efab-a8ce-000000000509] 18662 1726867347.28220: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000509 18662 1726867347.28363: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000509 18662 1726867347.28366: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18662 1726867347.28418: no more pending results, returning what we have 18662 1726867347.28421: results queue empty 18662 1726867347.28422: checking for any_errors_fatal 18662 1726867347.28425: done checking for any_errors_fatal 18662 1726867347.28426: checking for max_fail_percentage 18662 1726867347.28427: done checking for max_fail_percentage 18662 1726867347.28428: checking to see if all hosts have failed and the running result is not ok 18662 1726867347.28429: done checking to see if all hosts have failed 18662 1726867347.28429: getting the remaining hosts for this loop 18662 1726867347.28430: done getting the remaining hosts for this loop 18662 1726867347.28437: getting the next task for host managed_node2 18662 1726867347.28443: done getting next task for host managed_node2 18662 1726867347.28445: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 18662 1726867347.28449: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867347.28452: getting variables 18662 1726867347.28454: in VariableManager get_vars() 18662 1726867347.28480: Calling all_inventory to load vars for managed_node2 18662 1726867347.28483: Calling groups_inventory to load vars for managed_node2 18662 1726867347.28486: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867347.28494: Calling all_plugins_play to load vars for managed_node2 18662 1726867347.28496: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867347.28498: Calling groups_plugins_play to load vars for managed_node2 18662 1726867347.29301: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867347.30321: done with get_vars() 18662 1726867347.30337: done getting variables 18662 1726867347.30375: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867347.30460: variable 'profile' from source: include params 18662 1726867347.30463: variable 'interface' from source: set_fact 18662 1726867347.30501: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-lsr27] ****************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:22:27 -0400 (0:00:00.034) 0:00:41.940 ****** 18662 1726867347.30521: entering _queue_task() for managed_node2/command 18662 1726867347.30697: worker is 1 (out of 1 available) 18662 1726867347.30709: exiting _queue_task() for managed_node2/command 18662 1726867347.30720: done queuing things up, now waiting for results queue to drain 18662 1726867347.30721: waiting for pending results... 
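[editor's note] Each of these skipped tasks is traced with two conditional evaluations: ansible_distribution_major_version != '6' (True) and then profile_stat.stat.exists (False), which is reported as the false_condition. A when: list like the hedged sketch below would produce exactly that evaluation order, though the log does not show whether the version check is written on the task itself or inherited from an enclosing block; the command is an assumption.

- name: "Get the fingerprint comment in ifcfg-{{ profile }}"
  command: "grep 'fingerprint' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # assumed command
  when:
    - ansible_distribution_major_version != '6'
    - profile_stat.stat.exists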
18662 1726867347.30871: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-lsr27 18662 1726867347.30956: in run() - task 0affcac9-a3a5-efab-a8ce-00000000050a 18662 1726867347.30967: variable 'ansible_search_path' from source: unknown 18662 1726867347.30971: variable 'ansible_search_path' from source: unknown 18662 1726867347.30998: calling self._execute() 18662 1726867347.31062: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.31068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867347.31076: variable 'omit' from source: magic vars 18662 1726867347.31391: variable 'ansible_distribution_major_version' from source: facts 18662 1726867347.31408: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867347.31591: variable 'profile_stat' from source: set_fact 18662 1726867347.31594: Evaluated conditional (profile_stat.stat.exists): False 18662 1726867347.31597: when evaluation is False, skipping this task 18662 1726867347.31599: _execute() done 18662 1726867347.31601: dumping result to json 18662 1726867347.31603: done dumping result, returning 18662 1726867347.31605: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-lsr27 [0affcac9-a3a5-efab-a8ce-00000000050a] 18662 1726867347.31606: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000050a 18662 1726867347.31670: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000050a 18662 1726867347.31673: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18662 1726867347.31724: no more pending results, returning what we have 18662 1726867347.31727: results queue empty 18662 1726867347.31728: checking for any_errors_fatal 18662 1726867347.31733: done checking for any_errors_fatal 18662 1726867347.31734: checking for max_fail_percentage 18662 1726867347.31736: done checking for max_fail_percentage 18662 1726867347.31736: checking to see if all hosts have failed and the running result is not ok 18662 1726867347.31737: done checking to see if all hosts have failed 18662 1726867347.31738: getting the remaining hosts for this loop 18662 1726867347.31739: done getting the remaining hosts for this loop 18662 1726867347.31742: getting the next task for host managed_node2 18662 1726867347.31747: done getting next task for host managed_node2 18662 1726867347.31749: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 18662 1726867347.31752: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867347.31755: getting variables 18662 1726867347.31756: in VariableManager get_vars() 18662 1726867347.31780: Calling all_inventory to load vars for managed_node2 18662 1726867347.31782: Calling groups_inventory to load vars for managed_node2 18662 1726867347.31785: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867347.31795: Calling all_plugins_play to load vars for managed_node2 18662 1726867347.31798: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867347.31801: Calling groups_plugins_play to load vars for managed_node2 18662 1726867347.32790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867347.33670: done with get_vars() 18662 1726867347.33685: done getting variables 18662 1726867347.33723: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867347.33795: variable 'profile' from source: include params 18662 1726867347.33798: variable 'interface' from source: set_fact 18662 1726867347.33835: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-lsr27] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:22:27 -0400 (0:00:00.033) 0:00:41.974 ****** 18662 1726867347.33857: entering _queue_task() for managed_node2/set_fact 18662 1726867347.34078: worker is 1 (out of 1 available) 18662 1726867347.34091: exiting _queue_task() for managed_node2/set_fact 18662 1726867347.34103: done queuing things up, now waiting for results queue to drain 18662 1726867347.34104: waiting for pending results... 
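[editor's note] For reference, the "Set connection var ..." lines traced later in this log (timeout 10, ssh connection, sh shell type, /bin/sh executable, pipelining off) describe the effective connection settings for managed_node2. Expressed as host variables they would look roughly like the sketch below; where they are actually defined (inventory, group_vars, or simply ansible-core defaults) is not visible here, so treat this as an assumption.

ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
# ansible_module_compression is shown as ZIP_DEFLATED, which is the ansible-core default rather than an explicit setting (assumption)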
18662 1726867347.34350: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-lsr27 18662 1726867347.34387: in run() - task 0affcac9-a3a5-efab-a8ce-00000000050b 18662 1726867347.34399: variable 'ansible_search_path' from source: unknown 18662 1726867347.34403: variable 'ansible_search_path' from source: unknown 18662 1726867347.34430: calling self._execute() 18662 1726867347.34499: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.34503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867347.34513: variable 'omit' from source: magic vars 18662 1726867347.34843: variable 'ansible_distribution_major_version' from source: facts 18662 1726867347.34851: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867347.34949: variable 'profile_stat' from source: set_fact 18662 1726867347.34961: Evaluated conditional (profile_stat.stat.exists): False 18662 1726867347.34964: when evaluation is False, skipping this task 18662 1726867347.34967: _execute() done 18662 1726867347.34970: dumping result to json 18662 1726867347.34973: done dumping result, returning 18662 1726867347.34979: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-lsr27 [0affcac9-a3a5-efab-a8ce-00000000050b] 18662 1726867347.35003: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000050b 18662 1726867347.35088: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000050b 18662 1726867347.35093: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 18662 1726867347.35146: no more pending results, returning what we have 18662 1726867347.35149: results queue empty 18662 1726867347.35150: checking for any_errors_fatal 18662 1726867347.35155: done checking for any_errors_fatal 18662 1726867347.35155: checking for max_fail_percentage 18662 1726867347.35157: done checking for max_fail_percentage 18662 1726867347.35158: checking to see if all hosts have failed and the running result is not ok 18662 1726867347.35158: done checking to see if all hosts have failed 18662 1726867347.35159: getting the remaining hosts for this loop 18662 1726867347.35160: done getting the remaining hosts for this loop 18662 1726867347.35163: getting the next task for host managed_node2 18662 1726867347.35169: done getting next task for host managed_node2 18662 1726867347.35171: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 18662 1726867347.35174: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867347.35180: getting variables 18662 1726867347.35181: in VariableManager get_vars() 18662 1726867347.35205: Calling all_inventory to load vars for managed_node2 18662 1726867347.35207: Calling groups_inventory to load vars for managed_node2 18662 1726867347.35212: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867347.35223: Calling all_plugins_play to load vars for managed_node2 18662 1726867347.35230: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867347.35234: Calling groups_plugins_play to load vars for managed_node2 18662 1726867347.36163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867347.37417: done with get_vars() 18662 1726867347.37450: done getting variables 18662 1726867347.37522: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867347.37646: variable 'profile' from source: include params 18662 1726867347.37651: variable 'interface' from source: set_fact 18662 1726867347.37726: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'lsr27'] ***************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 17:22:27 -0400 (0:00:00.039) 0:00:42.013 ****** 18662 1726867347.37764: entering _queue_task() for managed_node2/assert 18662 1726867347.38023: worker is 1 (out of 1 available) 18662 1726867347.38035: exiting _queue_task() for managed_node2/assert 18662 1726867347.38046: done queuing things up, now waiting for results queue to drain 18662 1726867347.38048: waiting for pending results... 
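[editor's note] The assert task queued here (assert_profile_absent.yml:5) is traced below as evaluating "not lsr_net_profile_exists" and passing. A hedged reconstruction of that task follows; the task name and condition come from the log, while the failure message is an assumption.

- name: "Assert that the profile is absent - '{{ profile }}'"
  assert:
    that:
      - not lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} is still present"   # assumed message; only the condition appears in the log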
18662 1726867347.38262: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'lsr27' 18662 1726867347.38332: in run() - task 0affcac9-a3a5-efab-a8ce-0000000004f6 18662 1726867347.38344: variable 'ansible_search_path' from source: unknown 18662 1726867347.38348: variable 'ansible_search_path' from source: unknown 18662 1726867347.38374: calling self._execute() 18662 1726867347.38462: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.38467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867347.38476: variable 'omit' from source: magic vars 18662 1726867347.38895: variable 'ansible_distribution_major_version' from source: facts 18662 1726867347.38899: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867347.38901: variable 'omit' from source: magic vars 18662 1726867347.38945: variable 'omit' from source: magic vars 18662 1726867347.39028: variable 'profile' from source: include params 18662 1726867347.39032: variable 'interface' from source: set_fact 18662 1726867347.39106: variable 'interface' from source: set_fact 18662 1726867347.39109: variable 'omit' from source: magic vars 18662 1726867347.39158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867347.39187: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867347.39206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867347.39221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867347.39234: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867347.39269: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867347.39272: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.39275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867347.39351: Set connection var ansible_timeout to 10 18662 1726867347.39354: Set connection var ansible_connection to ssh 18662 1726867347.39358: Set connection var ansible_shell_executable to /bin/sh 18662 1726867347.39360: Set connection var ansible_shell_type to sh 18662 1726867347.39368: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867347.39373: Set connection var ansible_pipelining to False 18662 1726867347.39392: variable 'ansible_shell_executable' from source: unknown 18662 1726867347.39408: variable 'ansible_connection' from source: unknown 18662 1726867347.39411: variable 'ansible_module_compression' from source: unknown 18662 1726867347.39416: variable 'ansible_shell_type' from source: unknown 18662 1726867347.39418: variable 'ansible_shell_executable' from source: unknown 18662 1726867347.39420: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.39422: variable 'ansible_pipelining' from source: unknown 18662 1726867347.39425: variable 'ansible_timeout' from source: unknown 18662 1726867347.39427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867347.39521: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867347.39533: variable 'omit' from source: magic vars 18662 1726867347.39538: starting attempt loop 18662 1726867347.39541: running the handler 18662 1726867347.39641: variable 'lsr_net_profile_exists' from source: set_fact 18662 1726867347.39644: Evaluated conditional (not lsr_net_profile_exists): True 18662 1726867347.39650: handler run complete 18662 1726867347.39662: attempt loop complete, returning result 18662 1726867347.39671: _execute() done 18662 1726867347.39674: dumping result to json 18662 1726867347.39678: done dumping result, returning 18662 1726867347.39684: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'lsr27' [0affcac9-a3a5-efab-a8ce-0000000004f6] 18662 1726867347.39689: sending task result for task 0affcac9-a3a5-efab-a8ce-0000000004f6 18662 1726867347.39761: done sending task result for task 0affcac9-a3a5-efab-a8ce-0000000004f6 18662 1726867347.39764: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 18662 1726867347.39819: no more pending results, returning what we have 18662 1726867347.39833: results queue empty 18662 1726867347.39834: checking for any_errors_fatal 18662 1726867347.39843: done checking for any_errors_fatal 18662 1726867347.39844: checking for max_fail_percentage 18662 1726867347.39846: done checking for max_fail_percentage 18662 1726867347.39847: checking to see if all hosts have failed and the running result is not ok 18662 1726867347.39848: done checking to see if all hosts have failed 18662 1726867347.39848: getting the remaining hosts for this loop 18662 1726867347.39850: done getting the remaining hosts for this loop 18662 1726867347.39852: getting the next task for host managed_node2 18662 1726867347.39860: done getting next task for host managed_node2 18662 1726867347.39863: ^ task is: TASK: Include the task 'assert_device_absent.yml' 18662 1726867347.39865: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867347.39871: getting variables 18662 1726867347.39880: in VariableManager get_vars() 18662 1726867347.39907: Calling all_inventory to load vars for managed_node2 18662 1726867347.39909: Calling groups_inventory to load vars for managed_node2 18662 1726867347.39912: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867347.39924: Calling all_plugins_play to load vars for managed_node2 18662 1726867347.39927: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867347.39933: Calling groups_plugins_play to load vars for managed_node2 18662 1726867347.40779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867347.41966: done with get_vars() 18662 1726867347.41993: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Friday 20 September 2024 17:22:27 -0400 (0:00:00.043) 0:00:42.056 ****** 18662 1726867347.42075: entering _queue_task() for managed_node2/include_tasks 18662 1726867347.42285: worker is 1 (out of 1 available) 18662 1726867347.42298: exiting _queue_task() for managed_node2/include_tasks 18662 1726867347.42313: done queuing things up, now waiting for results queue to drain 18662 1726867347.42314: waiting for pending results... 18662 1726867347.42697: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' 18662 1726867347.42736: in run() - task 0affcac9-a3a5-efab-a8ce-000000000075 18662 1726867347.42747: variable 'ansible_search_path' from source: unknown 18662 1726867347.42780: calling self._execute() 18662 1726867347.42865: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.43010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867347.43014: variable 'omit' from source: magic vars 18662 1726867347.43319: variable 'ansible_distribution_major_version' from source: facts 18662 1726867347.43323: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867347.43326: _execute() done 18662 1726867347.43328: dumping result to json 18662 1726867347.43354: done dumping result, returning 18662 1726867347.43358: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' [0affcac9-a3a5-efab-a8ce-000000000075] 18662 1726867347.43361: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000075 18662 1726867347.43542: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000075 18662 1726867347.43607: WORKER PROCESS EXITING 18662 1726867347.43636: no more pending results, returning what we have 18662 1726867347.43640: in VariableManager get_vars() 18662 1726867347.43672: Calling all_inventory to load vars for managed_node2 18662 1726867347.43675: Calling groups_inventory to load vars for managed_node2 18662 1726867347.43679: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867347.43689: Calling all_plugins_play to load vars for managed_node2 18662 1726867347.43692: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867347.43695: Calling groups_plugins_play to load vars for managed_node2 18662 1726867347.45298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867347.46484: done with get_vars() 18662 
1726867347.46504: variable 'ansible_search_path' from source: unknown 18662 1726867347.46520: we have included files to process 18662 1726867347.46522: generating all_blocks data 18662 1726867347.46524: done generating all_blocks data 18662 1726867347.46528: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18662 1726867347.46529: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18662 1726867347.46530: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 18662 1726867347.46657: in VariableManager get_vars() 18662 1726867347.46666: done with get_vars() 18662 1726867347.46742: done processing included file 18662 1726867347.46744: iterating over new_blocks loaded from include file 18662 1726867347.46745: in VariableManager get_vars() 18662 1726867347.46751: done with get_vars() 18662 1726867347.46752: filtering new block on tags 18662 1726867347.46763: done filtering new block on tags 18662 1726867347.46764: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 18662 1726867347.46767: extending task lists for all hosts with included blocks 18662 1726867347.46855: done extending task lists 18662 1726867347.46856: done processing included files 18662 1726867347.46857: results queue empty 18662 1726867347.46857: checking for any_errors_fatal 18662 1726867347.46860: done checking for any_errors_fatal 18662 1726867347.46860: checking for max_fail_percentage 18662 1726867347.46861: done checking for max_fail_percentage 18662 1726867347.46861: checking to see if all hosts have failed and the running result is not ok 18662 1726867347.46862: done checking to see if all hosts have failed 18662 1726867347.46862: getting the remaining hosts for this loop 18662 1726867347.46863: done getting the remaining hosts for this loop 18662 1726867347.46864: getting the next task for host managed_node2 18662 1726867347.46867: done getting next task for host managed_node2 18662 1726867347.46868: ^ task is: TASK: Include the task 'get_interface_stat.yml' 18662 1726867347.46870: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867347.46871: getting variables 18662 1726867347.46872: in VariableManager get_vars() 18662 1726867347.46879: Calling all_inventory to load vars for managed_node2 18662 1726867347.46880: Calling groups_inventory to load vars for managed_node2 18662 1726867347.46882: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867347.46885: Calling all_plugins_play to load vars for managed_node2 18662 1726867347.46886: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867347.46888: Calling groups_plugins_play to load vars for managed_node2 18662 1726867347.47875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867347.49413: done with get_vars() 18662 1726867347.49432: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 17:22:27 -0400 (0:00:00.074) 0:00:42.130 ****** 18662 1726867347.49500: entering _queue_task() for managed_node2/include_tasks 18662 1726867347.49760: worker is 1 (out of 1 available) 18662 1726867347.49772: exiting _queue_task() for managed_node2/include_tasks 18662 1726867347.49785: done queuing things up, now waiting for results queue to drain 18662 1726867347.49786: waiting for pending results... 18662 1726867347.50196: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 18662 1726867347.50202: in run() - task 0affcac9-a3a5-efab-a8ce-00000000053c 18662 1726867347.50205: variable 'ansible_search_path' from source: unknown 18662 1726867347.50207: variable 'ansible_search_path' from source: unknown 18662 1726867347.50225: calling self._execute() 18662 1726867347.50321: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.50382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867347.50386: variable 'omit' from source: magic vars 18662 1726867347.50730: variable 'ansible_distribution_major_version' from source: facts 18662 1726867347.50746: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867347.50759: _execute() done 18662 1726867347.50771: dumping result to json 18662 1726867347.50779: done dumping result, returning 18662 1726867347.50790: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-efab-a8ce-00000000053c] 18662 1726867347.50798: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000053c 18662 1726867347.50939: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000053c 18662 1726867347.50943: WORKER PROCESS EXITING 18662 1726867347.51003: no more pending results, returning what we have 18662 1726867347.51010: in VariableManager get_vars() 18662 1726867347.51044: Calling all_inventory to load vars for managed_node2 18662 1726867347.51047: Calling groups_inventory to load vars for managed_node2 18662 1726867347.51051: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867347.51064: Calling all_plugins_play to load vars for managed_node2 18662 1726867347.51067: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867347.51070: Calling groups_plugins_play to load vars for managed_node2 18662 1726867347.52598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 18662 1726867347.54158: done with get_vars() 18662 1726867347.54184: variable 'ansible_search_path' from source: unknown 18662 1726867347.54185: variable 'ansible_search_path' from source: unknown 18662 1726867347.54226: we have included files to process 18662 1726867347.54227: generating all_blocks data 18662 1726867347.54228: done generating all_blocks data 18662 1726867347.54229: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18662 1726867347.54230: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18662 1726867347.54233: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 18662 1726867347.54418: done processing included file 18662 1726867347.54420: iterating over new_blocks loaded from include file 18662 1726867347.54422: in VariableManager get_vars() 18662 1726867347.54435: done with get_vars() 18662 1726867347.54437: filtering new block on tags 18662 1726867347.54451: done filtering new block on tags 18662 1726867347.54454: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 18662 1726867347.54458: extending task lists for all hosts with included blocks 18662 1726867347.54538: done extending task lists 18662 1726867347.54539: done processing included files 18662 1726867347.54540: results queue empty 18662 1726867347.54540: checking for any_errors_fatal 18662 1726867347.54543: done checking for any_errors_fatal 18662 1726867347.54543: checking for max_fail_percentage 18662 1726867347.54544: done checking for max_fail_percentage 18662 1726867347.54545: checking to see if all hosts have failed and the running result is not ok 18662 1726867347.54545: done checking to see if all hosts have failed 18662 1726867347.54546: getting the remaining hosts for this loop 18662 1726867347.54547: done getting the remaining hosts for this loop 18662 1726867347.54549: getting the next task for host managed_node2 18662 1726867347.54553: done getting next task for host managed_node2 18662 1726867347.54554: ^ task is: TASK: Get stat for interface {{ interface }} 18662 1726867347.54557: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867347.54558: getting variables 18662 1726867347.54559: in VariableManager get_vars() 18662 1726867347.54567: Calling all_inventory to load vars for managed_node2 18662 1726867347.54569: Calling groups_inventory to load vars for managed_node2 18662 1726867347.54572: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867347.54578: Calling all_plugins_play to load vars for managed_node2 18662 1726867347.54581: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867347.54584: Calling groups_plugins_play to load vars for managed_node2 18662 1726867347.55737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867347.57388: done with get_vars() 18662 1726867347.57411: done getting variables 18662 1726867347.57566: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:22:27 -0400 (0:00:00.080) 0:00:42.211 ****** 18662 1726867347.57599: entering _queue_task() for managed_node2/stat 18662 1726867347.57957: worker is 1 (out of 1 available) 18662 1726867347.57969: exiting _queue_task() for managed_node2/stat 18662 1726867347.58182: done queuing things up, now waiting for results queue to drain 18662 1726867347.58184: waiting for pending results... 18662 1726867347.58315: running TaskExecutor() for managed_node2/TASK: Get stat for interface lsr27 18662 1726867347.58412: in run() - task 0affcac9-a3a5-efab-a8ce-000000000554 18662 1726867347.58416: variable 'ansible_search_path' from source: unknown 18662 1726867347.58418: variable 'ansible_search_path' from source: unknown 18662 1726867347.58520: calling self._execute() 18662 1726867347.58571: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.58588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867347.58606: variable 'omit' from source: magic vars 18662 1726867347.59026: variable 'ansible_distribution_major_version' from source: facts 18662 1726867347.59044: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867347.59057: variable 'omit' from source: magic vars 18662 1726867347.59124: variable 'omit' from source: magic vars 18662 1726867347.59282: variable 'interface' from source: set_fact 18662 1726867347.59287: variable 'omit' from source: magic vars 18662 1726867347.59311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867347.59359: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867347.59394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867347.59419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867347.59440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867347.59499: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867347.59503: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.59506: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 18662 1726867347.59615: Set connection var ansible_timeout to 10 18662 1726867347.59624: Set connection var ansible_connection to ssh 18662 1726867347.59652: Set connection var ansible_shell_executable to /bin/sh 18662 1726867347.59656: Set connection var ansible_shell_type to sh 18662 1726867347.59661: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867347.59717: Set connection var ansible_pipelining to False 18662 1726867347.59720: variable 'ansible_shell_executable' from source: unknown 18662 1726867347.59723: variable 'ansible_connection' from source: unknown 18662 1726867347.59725: variable 'ansible_module_compression' from source: unknown 18662 1726867347.59727: variable 'ansible_shell_type' from source: unknown 18662 1726867347.59729: variable 'ansible_shell_executable' from source: unknown 18662 1726867347.59731: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867347.59740: variable 'ansible_pipelining' from source: unknown 18662 1726867347.59747: variable 'ansible_timeout' from source: unknown 18662 1726867347.59760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867347.59988: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 18662 1726867347.60005: variable 'omit' from source: magic vars 18662 1726867347.60043: starting attempt loop 18662 1726867347.60046: running the handler 18662 1726867347.60049: _low_level_execute_command(): starting 18662 1726867347.60059: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867347.60927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867347.60944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867347.60996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867347.61070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867347.61085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867347.61116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867347.61142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867347.61157: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867347.61488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867347.63013: stdout chunk (state=3): >>>/root <<< 18662 1726867347.63265: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 18662 1726867347.63269: stdout chunk (state=3): >>><<< 18662 1726867347.63271: stderr chunk (state=3): >>><<< 18662 1726867347.63374: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867347.63380: _low_level_execute_command(): starting 18662 1726867347.63383: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979 `" && echo ansible-tmp-1726867347.6328135-20653-2569803306979="` echo /root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979 `" ) && sleep 0' 18662 1726867347.64548: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867347.64780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867347.64792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867347.64795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867347.64886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867347.65039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867347.67058: stdout chunk (state=3): >>>ansible-tmp-1726867347.6328135-20653-2569803306979=/root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979 <<< 18662 1726867347.67223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
18662 1726867347.67226: stdout chunk (state=3): >>><<< 18662 1726867347.67231: stderr chunk (state=3): >>><<< 18662 1726867347.67387: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867347.6328135-20653-2569803306979=/root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867347.67391: variable 'ansible_module_compression' from source: unknown 18662 1726867347.67420: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 18662 1726867347.67465: variable 'ansible_facts' from source: unknown 18662 1726867347.67561: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979/AnsiballZ_stat.py 18662 1726867347.67797: Sending initial data 18662 1726867347.67800: Sent initial data (151 bytes) 18662 1726867347.68333: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867347.68342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867347.68354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867347.68396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867347.68412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867347.68484: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867347.68502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867347.68572: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 18662 1726867347.70232: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867347.70293: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979/AnsiballZ_stat.py" <<< 18662 1726867347.70296: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpzvv2rawx /root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979/AnsiballZ_stat.py <<< 18662 1726867347.70532: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpzvv2rawx" to remote "/root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979/AnsiballZ_stat.py" <<< 18662 1726867347.71660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867347.71812: stdout chunk (state=3): >>><<< 18662 1726867347.71815: stderr chunk (state=3): >>><<< 18662 1726867347.71817: done transferring module to remote 18662 1726867347.71819: _low_level_execute_command(): starting 18662 1726867347.71821: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979/ /root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979/AnsiballZ_stat.py && sleep 0' 18662 1726867347.72675: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867347.72693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867347.72705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867347.72728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867347.72742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867347.72775: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867347.72830: stderr chunk (state=3): >>>debug2: match found <<< 18662 1726867347.72859: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867347.72908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867347.72925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867347.72962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867347.73111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867347.75382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867347.75386: stdout chunk (state=3): >>><<< 18662 1726867347.75388: stderr chunk (state=3): >>><<< 18662 1726867347.75390: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867347.75392: _low_level_execute_command(): starting 18662 1726867347.75394: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979/AnsiballZ_stat.py && sleep 0' 18662 1726867347.76317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867347.76321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867347.76323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867347.76325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867347.76502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867347.76506: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867347.76570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867347.76681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867347.92203: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 18662 1726867347.93660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 18662 1726867347.93665: stdout chunk (state=3): >>><<< 18662 1726867347.93667: stderr chunk (state=3): >>><<< 18662 1726867347.93687: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
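[editor's note] The module invocation returned just above (path /sys/class/net/lsr27 with attributes, checksum and mime collection disabled, stat.exists == false) is produced by the stat task at tasks/get_interface_stat.yml:3. A minimal sketch of such a task, assuming only the variable and register names that appear elsewhere in this log (interface, interface_stat):

    # Sketch reconstructed from the module_args shown in the log; not copied from the collection.
    - name: "Get stat for interface {{ interface }}"
      stat:
        path: "/sys/class/net/{{ interface }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat
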
18662 1726867347.93720: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867347.93737: _low_level_execute_command(): starting 18662 1726867347.93745: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867347.6328135-20653-2569803306979/ > /dev/null 2>&1 && sleep 0' 18662 1726867347.94922: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867347.94947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867347.95055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867347.95197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867347.95209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867347.95337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867347.97269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867347.97282: stdout chunk (state=3): >>><<< 18662 1726867347.97294: stderr chunk (state=3): >>><<< 18662 1726867347.97340: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867347.97352: handler run complete 18662 1726867347.97482: attempt loop complete, returning result 18662 1726867347.97486: _execute() done 18662 1726867347.97488: dumping result to json 18662 1726867347.97490: done dumping result, returning 18662 1726867347.97492: done running TaskExecutor() for managed_node2/TASK: Get stat for interface lsr27 [0affcac9-a3a5-efab-a8ce-000000000554] 18662 1726867347.97493: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000554 ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 18662 1726867347.97799: no more pending results, returning what we have 18662 1726867347.97804: results queue empty 18662 1726867347.97805: checking for any_errors_fatal 18662 1726867347.97807: done checking for any_errors_fatal 18662 1726867347.97807: checking for max_fail_percentage 18662 1726867347.97809: done checking for max_fail_percentage 18662 1726867347.97810: checking to see if all hosts have failed and the running result is not ok 18662 1726867347.97811: done checking to see if all hosts have failed 18662 1726867347.97811: getting the remaining hosts for this loop 18662 1726867347.97813: done getting the remaining hosts for this loop 18662 1726867347.97816: getting the next task for host managed_node2 18662 1726867347.97825: done getting next task for host managed_node2 18662 1726867347.97828: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 18662 1726867347.97832: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867347.97837: getting variables 18662 1726867347.97838: in VariableManager get_vars() 18662 1726867347.97869: Calling all_inventory to load vars for managed_node2 18662 1726867347.97872: Calling groups_inventory to load vars for managed_node2 18662 1726867347.97875: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867347.97890: Calling all_plugins_play to load vars for managed_node2 18662 1726867347.97894: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867347.97897: Calling groups_plugins_play to load vars for managed_node2 18662 1726867347.98684: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000554 18662 1726867347.98688: WORKER PROCESS EXITING 18662 1726867348.01074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867348.04243: done with get_vars() 18662 1726867348.04271: done getting variables 18662 1726867348.04332: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 18662 1726867348.04550: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'lsr27'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 17:22:28 -0400 (0:00:00.469) 0:00:42.681 ****** 18662 1726867348.04786: entering _queue_task() for managed_node2/assert 18662 1726867348.05334: worker is 1 (out of 1 available) 18662 1726867348.05345: exiting _queue_task() for managed_node2/assert 18662 1726867348.05356: done queuing things up, now waiting for results queue to drain 18662 1726867348.05358: waiting for pending results... 
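[editor's note] The assert task queued above (tasks/assert_device_absent.yml:5) consumes the interface_stat result registered by the previous task; the log entries that follow show it evaluating the conditional (not interface_stat.stat.exists). A minimal sketch of a task that would produce exactly that evaluation, with everything beyond the conditional assumed:

    # Sketch only: the conditional is taken from the "Evaluated conditional" log entry below.
    - name: "Assert that the interface is absent - '{{ interface }}'"
      assert:
        that:
          - not interface_stat.stat.exists
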
18662 1726867348.05843: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'lsr27' 18662 1726867348.05998: in run() - task 0affcac9-a3a5-efab-a8ce-00000000053d 18662 1726867348.06023: variable 'ansible_search_path' from source: unknown 18662 1726867348.06032: variable 'ansible_search_path' from source: unknown 18662 1726867348.06072: calling self._execute() 18662 1726867348.06174: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867348.06190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867348.06212: variable 'omit' from source: magic vars 18662 1726867348.06594: variable 'ansible_distribution_major_version' from source: facts 18662 1726867348.06615: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867348.06626: variable 'omit' from source: magic vars 18662 1726867348.06668: variable 'omit' from source: magic vars 18662 1726867348.06776: variable 'interface' from source: set_fact 18662 1726867348.06884: variable 'omit' from source: magic vars 18662 1726867348.06887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867348.06890: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867348.06913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867348.06936: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867348.06954: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867348.06994: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867348.07003: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867348.07014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867348.07122: Set connection var ansible_timeout to 10 18662 1726867348.07131: Set connection var ansible_connection to ssh 18662 1726867348.07142: Set connection var ansible_shell_executable to /bin/sh 18662 1726867348.07148: Set connection var ansible_shell_type to sh 18662 1726867348.07162: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867348.07171: Set connection var ansible_pipelining to False 18662 1726867348.07199: variable 'ansible_shell_executable' from source: unknown 18662 1726867348.07214: variable 'ansible_connection' from source: unknown 18662 1726867348.07220: variable 'ansible_module_compression' from source: unknown 18662 1726867348.07226: variable 'ansible_shell_type' from source: unknown 18662 1726867348.07232: variable 'ansible_shell_executable' from source: unknown 18662 1726867348.07317: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867348.07320: variable 'ansible_pipelining' from source: unknown 18662 1726867348.07323: variable 'ansible_timeout' from source: unknown 18662 1726867348.07325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867348.07400: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 18662 1726867348.07418: variable 'omit' from source: magic vars 18662 1726867348.07435: starting attempt loop 18662 1726867348.07443: running the handler 18662 1726867348.07601: variable 'interface_stat' from source: set_fact 18662 1726867348.07624: Evaluated conditional (not interface_stat.stat.exists): True 18662 1726867348.07634: handler run complete 18662 1726867348.07662: attempt loop complete, returning result 18662 1726867348.07669: _execute() done 18662 1726867348.07679: dumping result to json 18662 1726867348.07687: done dumping result, returning 18662 1726867348.07702: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'lsr27' [0affcac9-a3a5-efab-a8ce-00000000053d] 18662 1726867348.07753: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000053d 18662 1726867348.07827: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000053d 18662 1726867348.07831: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 18662 1726867348.07905: no more pending results, returning what we have 18662 1726867348.07911: results queue empty 18662 1726867348.07912: checking for any_errors_fatal 18662 1726867348.07919: done checking for any_errors_fatal 18662 1726867348.07920: checking for max_fail_percentage 18662 1726867348.07922: done checking for max_fail_percentage 18662 1726867348.07923: checking to see if all hosts have failed and the running result is not ok 18662 1726867348.07923: done checking to see if all hosts have failed 18662 1726867348.07924: getting the remaining hosts for this loop 18662 1726867348.07925: done getting the remaining hosts for this loop 18662 1726867348.07929: getting the next task for host managed_node2 18662 1726867348.07939: done getting next task for host managed_node2 18662 1726867348.07941: ^ task is: TASK: meta (flush_handlers) 18662 1726867348.07942: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867348.07946: getting variables 18662 1726867348.07948: in VariableManager get_vars() 18662 1726867348.07978: Calling all_inventory to load vars for managed_node2 18662 1726867348.07982: Calling groups_inventory to load vars for managed_node2 18662 1726867348.07986: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867348.07997: Calling all_plugins_play to load vars for managed_node2 18662 1726867348.08001: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867348.08004: Calling groups_plugins_play to load vars for managed_node2 18662 1726867348.10063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867348.11669: done with get_vars() 18662 1726867348.11695: done getting variables 18662 1726867348.11760: in VariableManager get_vars() 18662 1726867348.11769: Calling all_inventory to load vars for managed_node2 18662 1726867348.11771: Calling groups_inventory to load vars for managed_node2 18662 1726867348.11774: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867348.11780: Calling all_plugins_play to load vars for managed_node2 18662 1726867348.11782: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867348.11785: Calling groups_plugins_play to load vars for managed_node2 18662 1726867348.13044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867348.14693: done with get_vars() 18662 1726867348.14717: done queuing things up, now waiting for results queue to drain 18662 1726867348.14719: results queue empty 18662 1726867348.14720: checking for any_errors_fatal 18662 1726867348.14723: done checking for any_errors_fatal 18662 1726867348.14724: checking for max_fail_percentage 18662 1726867348.14725: done checking for max_fail_percentage 18662 1726867348.14726: checking to see if all hosts have failed and the running result is not ok 18662 1726867348.14726: done checking to see if all hosts have failed 18662 1726867348.14732: getting the remaining hosts for this loop 18662 1726867348.14733: done getting the remaining hosts for this loop 18662 1726867348.14736: getting the next task for host managed_node2 18662 1726867348.14739: done getting next task for host managed_node2 18662 1726867348.14741: ^ task is: TASK: meta (flush_handlers) 18662 1726867348.14742: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867348.14745: getting variables 18662 1726867348.14746: in VariableManager get_vars() 18662 1726867348.14754: Calling all_inventory to load vars for managed_node2 18662 1726867348.14756: Calling groups_inventory to load vars for managed_node2 18662 1726867348.14758: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867348.14763: Calling all_plugins_play to load vars for managed_node2 18662 1726867348.14765: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867348.14768: Calling groups_plugins_play to load vars for managed_node2 18662 1726867348.15924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867348.17640: done with get_vars() 18662 1726867348.17661: done getting variables 18662 1726867348.17919: in VariableManager get_vars() 18662 1726867348.17928: Calling all_inventory to load vars for managed_node2 18662 1726867348.17930: Calling groups_inventory to load vars for managed_node2 18662 1726867348.17932: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867348.17937: Calling all_plugins_play to load vars for managed_node2 18662 1726867348.17939: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867348.17942: Calling groups_plugins_play to load vars for managed_node2 18662 1726867348.19243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867348.20907: done with get_vars() 18662 1726867348.20930: done queuing things up, now waiting for results queue to drain 18662 1726867348.20931: results queue empty 18662 1726867348.20932: checking for any_errors_fatal 18662 1726867348.20933: done checking for any_errors_fatal 18662 1726867348.20934: checking for max_fail_percentage 18662 1726867348.20935: done checking for max_fail_percentage 18662 1726867348.20935: checking to see if all hosts have failed and the running result is not ok 18662 1726867348.20936: done checking to see if all hosts have failed 18662 1726867348.20936: getting the remaining hosts for this loop 18662 1726867348.20937: done getting the remaining hosts for this loop 18662 1726867348.20939: getting the next task for host managed_node2 18662 1726867348.20942: done getting next task for host managed_node2 18662 1726867348.20943: ^ task is: None 18662 1726867348.20944: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867348.20945: done queuing things up, now waiting for results queue to drain 18662 1726867348.20946: results queue empty 18662 1726867348.20946: checking for any_errors_fatal 18662 1726867348.20947: done checking for any_errors_fatal 18662 1726867348.20947: checking for max_fail_percentage 18662 1726867348.20948: done checking for max_fail_percentage 18662 1726867348.20948: checking to see if all hosts have failed and the running result is not ok 18662 1726867348.20949: done checking to see if all hosts have failed 18662 1726867348.20950: getting the next task for host managed_node2 18662 1726867348.20952: done getting next task for host managed_node2 18662 1726867348.20952: ^ task is: None 18662 1726867348.20953: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867348.20988: in VariableManager get_vars() 18662 1726867348.21001: done with get_vars() 18662 1726867348.21006: in VariableManager get_vars() 18662 1726867348.21013: done with get_vars() 18662 1726867348.21017: variable 'omit' from source: magic vars 18662 1726867348.21045: in VariableManager get_vars() 18662 1726867348.21055: done with get_vars() 18662 1726867348.21074: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 18662 1726867348.21246: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 18662 1726867348.21266: getting the remaining hosts for this loop 18662 1726867348.21267: done getting the remaining hosts for this loop 18662 1726867348.21269: getting the next task for host managed_node2 18662 1726867348.21271: done getting next task for host managed_node2 18662 1726867348.21273: ^ task is: TASK: Gathering Facts 18662 1726867348.21274: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867348.21276: getting variables 18662 1726867348.21278: in VariableManager get_vars() 18662 1726867348.21285: Calling all_inventory to load vars for managed_node2 18662 1726867348.21287: Calling groups_inventory to load vars for managed_node2 18662 1726867348.21289: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867348.21294: Calling all_plugins_play to load vars for managed_node2 18662 1726867348.21296: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867348.21298: Calling groups_plugins_play to load vars for managed_node2 18662 1726867348.22604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867348.25137: done with get_vars() 18662 1726867348.25158: done getting variables 18662 1726867348.25202: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Friday 20 September 2024 17:22:28 -0400 (0:00:00.206) 0:00:42.887 ****** 18662 1726867348.25226: entering _queue_task() for managed_node2/gather_facts 18662 1726867348.25567: worker is 1 (out of 1 available) 18662 1726867348.25581: exiting _queue_task() for managed_node2/gather_facts 18662 1726867348.25785: done queuing things up, now waiting for results queue to drain 18662 1726867348.25787: waiting for pending results... 
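[editor's note] The play announced above ("Verify that cleanup restored state to default", tests_ethernet.yml:77) opens with implicit fact gathering, which is why a gather_facts task is queued next. A minimal sketch of such a play header, with the host pattern and fact-gathering setting assumed rather than taken from the playbook:

    # Sketch of the play opening; the verification tasks in the real playbook are omitted.
    - name: Verify that cleanup restored state to default
      hosts: managed_node2
      gather_facts: true
      tasks: []   # verification tasks follow in tests_ethernet.yml
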
18662 1726867348.25874: running TaskExecutor() for managed_node2/TASK: Gathering Facts 18662 1726867348.26184: in run() - task 0affcac9-a3a5-efab-a8ce-00000000056d 18662 1726867348.26188: variable 'ansible_search_path' from source: unknown 18662 1726867348.26191: calling self._execute() 18662 1726867348.26194: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867348.26196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867348.26198: variable 'omit' from source: magic vars 18662 1726867348.26558: variable 'ansible_distribution_major_version' from source: facts 18662 1726867348.26576: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867348.26590: variable 'omit' from source: magic vars 18662 1726867348.26623: variable 'omit' from source: magic vars 18662 1726867348.26671: variable 'omit' from source: magic vars 18662 1726867348.26721: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867348.26766: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867348.26795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867348.26819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867348.26836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867348.26879: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867348.26888: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867348.26897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867348.27022: Set connection var ansible_timeout to 10 18662 1726867348.27030: Set connection var ansible_connection to ssh 18662 1726867348.27040: Set connection var ansible_shell_executable to /bin/sh 18662 1726867348.27046: Set connection var ansible_shell_type to sh 18662 1726867348.27059: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867348.27070: Set connection var ansible_pipelining to False 18662 1726867348.27103: variable 'ansible_shell_executable' from source: unknown 18662 1726867348.27184: variable 'ansible_connection' from source: unknown 18662 1726867348.27187: variable 'ansible_module_compression' from source: unknown 18662 1726867348.27190: variable 'ansible_shell_type' from source: unknown 18662 1726867348.27192: variable 'ansible_shell_executable' from source: unknown 18662 1726867348.27194: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867348.27195: variable 'ansible_pipelining' from source: unknown 18662 1726867348.27198: variable 'ansible_timeout' from source: unknown 18662 1726867348.27200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867348.27327: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867348.27342: variable 'omit' from source: magic vars 18662 1726867348.27352: starting attempt loop 18662 1726867348.27358: running the 
handler 18662 1726867348.27378: variable 'ansible_facts' from source: unknown 18662 1726867348.27405: _low_level_execute_command(): starting 18662 1726867348.27418: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867348.28264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867348.28300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867348.28355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867348.28431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867348.28466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867348.28572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867348.30205: stdout chunk (state=3): >>>/root <<< 18662 1726867348.30457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867348.30460: stdout chunk (state=3): >>><<< 18662 1726867348.30462: stderr chunk (state=3): >>><<< 18662 1726867348.30531: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867348.30579: _low_level_execute_command(): starting 18662 1726867348.30584: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536 `" && echo 
ansible-tmp-1726867348.3054876-20687-192786711103536="` echo /root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536 `" ) && sleep 0' 18662 1726867348.31475: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867348.31480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867348.31483: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867348.31554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867348.31596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867348.33845: stdout chunk (state=3): >>>ansible-tmp-1726867348.3054876-20687-192786711103536=/root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536 <<< 18662 1726867348.33849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867348.33851: stdout chunk (state=3): >>><<< 18662 1726867348.33853: stderr chunk (state=3): >>><<< 18662 1726867348.33855: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867348.3054876-20687-192786711103536=/root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867348.33871: variable 'ansible_module_compression' from source: unknown 18662 1726867348.34025: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 
18662 1726867348.34057: variable 'ansible_facts' from source: unknown 18662 1726867348.34305: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536/AnsiballZ_setup.py 18662 1726867348.34553: Sending initial data 18662 1726867348.34580: Sent initial data (154 bytes) 18662 1726867348.35573: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867348.35673: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867348.35717: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867348.35741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867348.35766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867348.35836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867348.37445: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867348.37502: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867348.37563: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmpv9_gce6i /root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536/AnsiballZ_setup.py <<< 18662 1726867348.37588: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536/AnsiballZ_setup.py" <<< 18662 1726867348.37612: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmpv9_gce6i" to remote "/root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536/AnsiballZ_setup.py" <<< 18662 1726867348.39823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867348.39826: stdout chunk (state=3): >>><<< 18662 1726867348.39828: stderr chunk (state=3): >>><<< 18662 1726867348.39830: done transferring module to remote 18662 1726867348.39832: _low_level_execute_command(): starting 18662 1726867348.39834: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536/ /root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536/AnsiballZ_setup.py && sleep 0' 18662 1726867348.40390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867348.40402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867348.40421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867348.40439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867348.40455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867348.40466: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867348.40482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867348.40500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 18662 1726867348.40514: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 18662 1726867348.40593: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867348.40603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867348.40625: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867348.40650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867348.40718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867348.42690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867348.42699: stdout chunk (state=3): >>><<< 18662 1726867348.42718: stderr chunk (state=3): >>><<< 18662 1726867348.42743: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867348.42762: _low_level_execute_command(): starting 18662 1726867348.42782: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536/AnsiballZ_setup.py && sleep 0' 18662 1726867348.44062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867348.44066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867348.44069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867348.44082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867348.44153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867348.44156: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867348.44217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867349.07799: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", 
"ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "22", "second": "28", "epoch": "1726867348", "epoch_int": "1726867348", "date": "2024-09-20", "time": "17:22:28", "iso8601_micro": "2024-09-20T21:22:28.727438Z", "iso8601": "2024-09-20T21:22:28Z", "iso8601_basic": "20240920T172228727438", "iso8601_basic_short": "20240920T172228", "tz": "EDT", "tz_dst": "EDT", "tz_offset": 
"-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2953, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 578, "free": 2953}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 586, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794910208, "block_size": 4096, "block_total": 65519099, "block_available": 63914773, "block_used": 1604326, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": 
"off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.31787109375, "5m": 0.36572265625, "15m": 0.20068359375}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 18662 1726867349.09849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867349.09876: stderr chunk (state=3): >>><<< 18662 1726867349.09888: stdout chunk (state=3): >>><<< 18662 1726867349.09932: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "22", "second": "28", "epoch": "1726867348", "epoch_int": "1726867348", "date": "2024-09-20", "time": "17:22:28", "iso8601_micro": "2024-09-20T21:22:28.727438Z", "iso8601": "2024-09-20T21:22:28Z", "iso8601_basic": "20240920T172228727438", "iso8601_basic_short": "20240920T172228", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2953, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 578, "free": 2953}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 586, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794910208, "block_size": 4096, "block_total": 65519099, "block_available": 63914773, "block_used": 1604326, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", 
"mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.31787109375, "5m": 0.36572265625, "15m": 0.20068359375}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867349.10308: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867349.10336: _low_level_execute_command(): starting 18662 1726867349.10345: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867348.3054876-20687-192786711103536/ > /dev/null 2>&1 && sleep 0' 18662 1726867349.10979: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867349.10994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867349.11008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867349.11026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867349.11054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867349.11057: stderr chunk (state=3): >>>debug2: match not found <<< 18662 1726867349.11154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867349.11167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867349.11196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867349.11373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867349.13161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867349.13164: stdout chunk (state=3): >>><<< 18662 1726867349.13166: stderr chunk (state=3): >>><<< 18662 1726867349.13387: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867349.13391: handler run complete 18662 1726867349.13393: variable 'ansible_facts' from source: unknown 18662 1726867349.13446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867349.13787: variable 'ansible_facts' from source: unknown 18662 1726867349.13884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867349.14024: attempt loop complete, returning result 18662 1726867349.14042: _execute() done 18662 1726867349.14051: dumping result to json 18662 1726867349.14091: done dumping result, returning 18662 1726867349.14103: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-efab-a8ce-00000000056d] 18662 1726867349.14111: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000056d 18662 1726867349.14583: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000056d 18662 1726867349.14586: WORKER PROCESS EXITING ok: [managed_node2] 18662 1726867349.14964: no more pending results, returning what we have 18662 1726867349.14968: results queue empty 18662 1726867349.14969: checking for any_errors_fatal 18662 1726867349.14970: done checking for any_errors_fatal 18662 1726867349.14971: checking for max_fail_percentage 18662 1726867349.14973: done checking for max_fail_percentage 18662 1726867349.14974: checking to see if all hosts have failed and the running result is not ok 18662 1726867349.14974: done checking to see if all hosts have failed 18662 1726867349.14975: getting 
the remaining hosts for this loop 18662 1726867349.14976: done getting the remaining hosts for this loop 18662 1726867349.14982: getting the next task for host managed_node2 18662 1726867349.14988: done getting next task for host managed_node2 18662 1726867349.14990: ^ task is: TASK: meta (flush_handlers) 18662 1726867349.14992: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867349.15081: getting variables 18662 1726867349.15084: in VariableManager get_vars() 18662 1726867349.15113: Calling all_inventory to load vars for managed_node2 18662 1726867349.15116: Calling groups_inventory to load vars for managed_node2 18662 1726867349.15120: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867349.15131: Calling all_plugins_play to load vars for managed_node2 18662 1726867349.15134: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867349.15137: Calling groups_plugins_play to load vars for managed_node2 18662 1726867349.20806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867349.23807: done with get_vars() 18662 1726867349.23830: done getting variables 18662 1726867349.24020: in VariableManager get_vars() 18662 1726867349.24037: Calling all_inventory to load vars for managed_node2 18662 1726867349.24041: Calling groups_inventory to load vars for managed_node2 18662 1726867349.24044: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867349.24052: Calling all_plugins_play to load vars for managed_node2 18662 1726867349.24057: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867349.24061: Calling groups_plugins_play to load vars for managed_node2 18662 1726867349.26716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867349.30003: done with get_vars() 18662 1726867349.30028: done queuing things up, now waiting for results queue to drain 18662 1726867349.30030: results queue empty 18662 1726867349.30031: checking for any_errors_fatal 18662 1726867349.30039: done checking for any_errors_fatal 18662 1726867349.30040: checking for max_fail_percentage 18662 1726867349.30041: done checking for max_fail_percentage 18662 1726867349.30042: checking to see if all hosts have failed and the running result is not ok 18662 1726867349.30043: done checking to see if all hosts have failed 18662 1726867349.30044: getting the remaining hosts for this loop 18662 1726867349.30045: done getting the remaining hosts for this loop 18662 1726867349.30047: getting the next task for host managed_node2 18662 1726867349.30051: done getting next task for host managed_node2 18662 1726867349.30055: ^ task is: TASK: Verify network state restored to default 18662 1726867349.30056: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867349.30058: getting variables 18662 1726867349.30059: in VariableManager get_vars() 18662 1726867349.30068: Calling all_inventory to load vars for managed_node2 18662 1726867349.30070: Calling groups_inventory to load vars for managed_node2 18662 1726867349.30073: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867349.30282: Calling all_plugins_play to load vars for managed_node2 18662 1726867349.30285: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867349.30289: Calling groups_plugins_play to load vars for managed_node2 18662 1726867349.32383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867349.35315: done with get_vars() 18662 1726867349.35337: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Friday 20 September 2024 17:22:29 -0400 (0:00:01.101) 0:00:43.989 ****** 18662 1726867349.35415: entering _queue_task() for managed_node2/include_tasks 18662 1726867349.35814: worker is 1 (out of 1 available) 18662 1726867349.35836: exiting _queue_task() for managed_node2/include_tasks 18662 1726867349.35856: done queuing things up, now waiting for results queue to drain 18662 1726867349.35858: waiting for pending results... 18662 1726867349.36424: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 18662 1726867349.36557: in run() - task 0affcac9-a3a5-efab-a8ce-000000000078 18662 1726867349.36560: variable 'ansible_search_path' from source: unknown 18662 1726867349.36564: calling self._execute() 18662 1726867349.36882: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867349.36888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867349.36891: variable 'omit' from source: magic vars 18662 1726867349.37938: variable 'ansible_distribution_major_version' from source: facts 18662 1726867349.37941: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867349.37944: _execute() done 18662 1726867349.37947: dumping result to json 18662 1726867349.37949: done dumping result, returning 18662 1726867349.37952: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0affcac9-a3a5-efab-a8ce-000000000078] 18662 1726867349.37954: sending task result for task 0affcac9-a3a5-efab-a8ce-000000000078 18662 1726867349.38057: no more pending results, returning what we have 18662 1726867349.38063: in VariableManager get_vars() 18662 1726867349.38099: Calling all_inventory to load vars for managed_node2 18662 1726867349.38102: Calling groups_inventory to load vars for managed_node2 18662 1726867349.38106: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867349.38123: Calling all_plugins_play to load vars for managed_node2 18662 1726867349.38127: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867349.38130: Calling groups_plugins_play to load vars for managed_node2 18662 1726867349.38808: done sending task result for task 0affcac9-a3a5-efab-a8ce-000000000078 18662 1726867349.38814: WORKER PROCESS EXITING 18662 1726867349.41337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867349.44108: done with get_vars() 18662 1726867349.44130: 
variable 'ansible_search_path' from source: unknown 18662 1726867349.44145: we have included files to process 18662 1726867349.44146: generating all_blocks data 18662 1726867349.44147: done generating all_blocks data 18662 1726867349.44148: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18662 1726867349.44149: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18662 1726867349.44151: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 18662 1726867349.44550: done processing included file 18662 1726867349.44552: iterating over new_blocks loaded from include file 18662 1726867349.44553: in VariableManager get_vars() 18662 1726867349.44565: done with get_vars() 18662 1726867349.44566: filtering new block on tags 18662 1726867349.44584: done filtering new block on tags 18662 1726867349.44586: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 18662 1726867349.44591: extending task lists for all hosts with included blocks 18662 1726867349.44623: done extending task lists 18662 1726867349.44624: done processing included files 18662 1726867349.44625: results queue empty 18662 1726867349.44625: checking for any_errors_fatal 18662 1726867349.44627: done checking for any_errors_fatal 18662 1726867349.44628: checking for max_fail_percentage 18662 1726867349.44629: done checking for max_fail_percentage 18662 1726867349.44629: checking to see if all hosts have failed and the running result is not ok 18662 1726867349.44630: done checking to see if all hosts have failed 18662 1726867349.44631: getting the remaining hosts for this loop 18662 1726867349.44632: done getting the remaining hosts for this loop 18662 1726867349.44634: getting the next task for host managed_node2 18662 1726867349.44638: done getting next task for host managed_node2 18662 1726867349.44640: ^ task is: TASK: Check routes and DNS 18662 1726867349.44641: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 18662 1726867349.44643: getting variables 18662 1726867349.44644: in VariableManager get_vars() 18662 1726867349.44651: Calling all_inventory to load vars for managed_node2 18662 1726867349.44653: Calling groups_inventory to load vars for managed_node2 18662 1726867349.44656: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867349.44661: Calling all_plugins_play to load vars for managed_node2 18662 1726867349.44663: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867349.44666: Calling groups_plugins_play to load vars for managed_node2 18662 1726867349.45860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867349.47474: done with get_vars() 18662 1726867349.47495: done getting variables 18662 1726867349.47536: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 17:22:29 -0400 (0:00:00.121) 0:00:44.111 ****** 18662 1726867349.47563: entering _queue_task() for managed_node2/shell 18662 1726867349.47899: worker is 1 (out of 1 available) 18662 1726867349.47912: exiting _queue_task() for managed_node2/shell 18662 1726867349.47922: done queuing things up, now waiting for results queue to drain 18662 1726867349.47924: waiting for pending results... 18662 1726867349.48192: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 18662 1726867349.48322: in run() - task 0affcac9-a3a5-efab-a8ce-00000000057e 18662 1726867349.48343: variable 'ansible_search_path' from source: unknown 18662 1726867349.48351: variable 'ansible_search_path' from source: unknown 18662 1726867349.48392: calling self._execute() 18662 1726867349.48488: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867349.48500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867349.48516: variable 'omit' from source: magic vars 18662 1726867349.48897: variable 'ansible_distribution_major_version' from source: facts 18662 1726867349.48917: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867349.48927: variable 'omit' from source: magic vars 18662 1726867349.48978: variable 'omit' from source: magic vars 18662 1726867349.49072: variable 'omit' from source: magic vars 18662 1726867349.49075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867349.49112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867349.49141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867349.49162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867349.49183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867349.49219: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
18662 1726867349.49227: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867349.49234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867349.49343: Set connection var ansible_timeout to 10 18662 1726867349.49398: Set connection var ansible_connection to ssh 18662 1726867349.49401: Set connection var ansible_shell_executable to /bin/sh 18662 1726867349.49403: Set connection var ansible_shell_type to sh 18662 1726867349.49405: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867349.49407: Set connection var ansible_pipelining to False 18662 1726867349.49417: variable 'ansible_shell_executable' from source: unknown 18662 1726867349.49425: variable 'ansible_connection' from source: unknown 18662 1726867349.49431: variable 'ansible_module_compression' from source: unknown 18662 1726867349.49437: variable 'ansible_shell_type' from source: unknown 18662 1726867349.49443: variable 'ansible_shell_executable' from source: unknown 18662 1726867349.49451: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867349.49459: variable 'ansible_pipelining' from source: unknown 18662 1726867349.49506: variable 'ansible_timeout' from source: unknown 18662 1726867349.49512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867349.49618: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867349.49637: variable 'omit' from source: magic vars 18662 1726867349.49646: starting attempt loop 18662 1726867349.49653: running the handler 18662 1726867349.49666: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867349.49689: _low_level_execute_command(): starting 18662 1726867349.49699: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867349.50486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867349.50505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867349.50590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867349.50898: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867349.50924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867349.52607: stdout chunk (state=3): >>>/root <<< 18662 1726867349.52767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867349.52770: stdout chunk (state=3): >>><<< 18662 1726867349.52772: stderr chunk (state=3): >>><<< 18662 1726867349.52999: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867349.53003: _low_level_execute_command(): starting 18662 1726867349.53006: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932 `" && echo ansible-tmp-1726867349.5289874-20738-66063629290932="` echo /root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932 `" ) && sleep 0' 18662 1726867349.53987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867349.54194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867349.54206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867349.54220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867349.54310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867349.54362: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 18662 1726867349.56379: stdout chunk (state=3): >>>ansible-tmp-1726867349.5289874-20738-66063629290932=/root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932 <<< 18662 1726867349.56518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867349.56521: stdout chunk (state=3): >>><<< 18662 1726867349.56528: stderr chunk (state=3): >>><<< 18662 1726867349.56693: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867349.5289874-20738-66063629290932=/root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867349.56696: variable 'ansible_module_compression' from source: unknown 18662 1726867349.56699: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18662 1726867349.56701: variable 'ansible_facts' from source: unknown 18662 1726867349.56749: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932/AnsiballZ_command.py 18662 1726867349.56902: Sending initial data 18662 1726867349.56905: Sent initial data (155 bytes) 18662 1726867349.58169: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867349.58282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 
1726867349.58456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867349.60074: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 18662 1726867349.60082: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 18662 1726867349.60091: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 18662 1726867349.60095: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867349.60142: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 18662 1726867349.60182: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp9gqrhdfz /root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932/AnsiballZ_command.py <<< 18662 1726867349.60189: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932/AnsiballZ_command.py" <<< 18662 1726867349.60220: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp9gqrhdfz" to remote "/root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932/AnsiballZ_command.py" <<< 18662 1726867349.60882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867349.60900: stderr chunk (state=3): >>><<< 18662 1726867349.60903: stdout chunk (state=3): >>><<< 18662 1726867349.61075: done transferring module to remote 18662 1726867349.61080: _low_level_execute_command(): starting 18662 1726867349.61083: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932/ /root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932/AnsiballZ_command.py && sleep 0' 18662 1726867349.61970: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867349.61973: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867349.61976: stderr chunk 
(state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867349.62000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867349.62027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867349.62118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867349.64049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867349.64072: stderr chunk (state=3): >>><<< 18662 1726867349.64075: stdout chunk (state=3): >>><<< 18662 1726867349.64097: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867349.64174: _low_level_execute_command(): starting 18662 1726867349.64180: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932/AnsiballZ_command.py && sleep 0' 18662 1726867349.64786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 18662 1726867349.64801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867349.64821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867349.64845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867349.64864: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867349.64904: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 18662 1726867349.64907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867349.65005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867349.65021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867349.65036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867349.65064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867349.65191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867349.81633: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:d5:c3:77:ad brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.12.116/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3297sec preferred_lft 3297sec\n inet6 fe80::8ff:d5ff:fec3:77ad/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.116 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.116 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:22:29.806158", "end": "2024-09-20 17:22:29.815035", "delta": "0:00:00.008877", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18662 1726867349.83218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867349.83239: stderr chunk (state=3): >>><<< 18662 1726867349.83242: stdout chunk (state=3): >>><<< 18662 1726867349.83260: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:d5:c3:77:ad brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.12.116/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3297sec preferred_lft 3297sec\n inet6 fe80::8ff:d5ff:fec3:77ad/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.116 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.116 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:22:29.806158", "end": "2024-09-20 17:22:29.815035", "delta": "0:00:00.008877", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
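For readability, the shell snippet that the "Check routes and DNS" task passes to the command module (it appears above in JSON-escaped form inside _raw_params) is:

    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi

set -euo pipefail aborts the script on the first failing command or unset variable, and the trailing || : keeps the fallback listing of /etc/resolv.* from failing the task when the file is absent; the labeled sections (IP, IP ROUTE, IP -6 ROUTE, RESOLV) are what appear in the STDOUT block of the task result below.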
18662 1726867349.83296: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867349.83303: _low_level_execute_command(): starting 18662 1726867349.83310: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867349.5289874-20738-66063629290932/ > /dev/null 2>&1 && sleep 0' 18662 1726867349.83746: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867349.83751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867349.83754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867349.83756: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867349.83759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 18662 1726867349.83761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867349.83812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867349.83815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867349.83820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867349.83860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867349.85733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867349.85754: stderr chunk (state=3): >>><<< 18662 1726867349.85757: stdout chunk (state=3): >>><<< 18662 1726867349.85768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867349.85774: handler run complete 18662 1726867349.85794: Evaluated conditional (False): False 18662 1726867349.85804: attempt loop complete, returning result 18662 1726867349.85809: _execute() done 18662 1726867349.85811: dumping result to json 18662 1726867349.85817: done dumping result, returning 18662 1726867349.85825: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0affcac9-a3a5-efab-a8ce-00000000057e] 18662 1726867349.85829: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000057e 18662 1726867349.85925: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000057e 18662 1726867349.85928: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008877", "end": "2024-09-20 17:22:29.815035", "rc": 0, "start": "2024-09-20 17:22:29.806158" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:d5:c3:77:ad brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.12.116/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3297sec preferred_lft 3297sec inet6 fe80::8ff:d5ff:fec3:77ad/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.116 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.116 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 18662 1726867349.85995: no more pending results, returning what we have 18662 1726867349.85999: results queue empty 18662 1726867349.85999: checking for any_errors_fatal 18662 1726867349.86001: done checking for any_errors_fatal 18662 1726867349.86001: checking for max_fail_percentage 18662 1726867349.86003: done checking for max_fail_percentage 18662 1726867349.86004: checking to see if all hosts have failed and the running result is not ok 18662 1726867349.86004: done checking to see if all hosts have failed 18662 1726867349.86005: getting the remaining hosts for this loop 18662 1726867349.86006: done getting the 
remaining hosts for this loop 18662 1726867349.86010: getting the next task for host managed_node2 18662 1726867349.86017: done getting next task for host managed_node2 18662 1726867349.86019: ^ task is: TASK: Verify DNS and network connectivity 18662 1726867349.86021: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 18662 1726867349.86025: getting variables 18662 1726867349.86031: in VariableManager get_vars() 18662 1726867349.86067: Calling all_inventory to load vars for managed_node2 18662 1726867349.86070: Calling groups_inventory to load vars for managed_node2 18662 1726867349.86073: Calling all_plugins_inventory to load vars for managed_node2 18662 1726867349.86085: Calling all_plugins_play to load vars for managed_node2 18662 1726867349.86088: Calling groups_plugins_inventory to load vars for managed_node2 18662 1726867349.86091: Calling groups_plugins_play to load vars for managed_node2 18662 1726867349.86985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 18662 1726867349.87868: done with get_vars() 18662 1726867349.87890: done getting variables 18662 1726867349.87933: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 17:22:29 -0400 (0:00:00.403) 0:00:44.515 ****** 18662 1726867349.87956: entering _queue_task() for managed_node2/shell 18662 1726867349.88182: worker is 1 (out of 1 available) 18662 1726867349.88193: exiting _queue_task() for managed_node2/shell 18662 1726867349.88203: done queuing things up, now waiting for results queue to drain 18662 1726867349.88204: waiting for pending results... 
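Each task in this run follows the same remote-execution sequence, visible in the _low_level_execute_command entries: the ssh connection plugin reuses the ControlMaster socket at /root/.ansible/cp/1ce91f36e8 (which is why every command's stderr repeats the same OpenSSH configuration-parsing and mux_client lines), and the command module itself travels as a self-contained AnsiballZ_command.py. A condensed sketch of that sequence, paraphrased from the commands logged above and below, with TMP standing in for the per-task directory such as /root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799:

    /bin/sh -c 'echo ~ && sleep 0'                                        # discover the remote home directory
    /bin/sh -c '( umask 77 && mkdir -p "`echo /root/.ansible/tmp`" && mkdir "`echo TMP`" && echo ... ) && sleep 0'
                                                                          # create the per-task temp dir and echo its name back
    # sftp> put <local staging file> TMP/AnsiballZ_command.py             # transfer the packaged module
    /bin/sh -c 'chmod u+x TMP/ TMP/AnsiballZ_command.py && sleep 0'       # make it executable
    /bin/sh -c '/usr/bin/python3.12 TMP/AnsiballZ_command.py && sleep 0'  # run it; the JSON result arrives on stdout
    /bin/sh -c 'rm -f -r TMP/ > /dev/null 2>&1 && sleep 0'                # clean up the temp dir

This is a paraphrase for orientation, not new output; the full paths appear verbatim in the corresponding "_low_level_execute_command(): executing:" lines.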
18662 1726867349.88369: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 18662 1726867349.88449: in run() - task 0affcac9-a3a5-efab-a8ce-00000000057f 18662 1726867349.88462: variable 'ansible_search_path' from source: unknown 18662 1726867349.88466: variable 'ansible_search_path' from source: unknown 18662 1726867349.88494: calling self._execute() 18662 1726867349.88573: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867349.88578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867349.88588: variable 'omit' from source: magic vars 18662 1726867349.88862: variable 'ansible_distribution_major_version' from source: facts 18662 1726867349.88872: Evaluated conditional (ansible_distribution_major_version != '6'): True 18662 1726867349.88967: variable 'ansible_facts' from source: unknown 18662 1726867349.89665: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 18662 1726867349.89886: variable 'omit' from source: magic vars 18662 1726867349.89889: variable 'omit' from source: magic vars 18662 1726867349.89892: variable 'omit' from source: magic vars 18662 1726867349.89894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 18662 1726867349.89897: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 18662 1726867349.89899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 18662 1726867349.89901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867349.89906: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 18662 1726867349.89942: variable 'inventory_hostname' from source: host vars for 'managed_node2' 18662 1726867349.89951: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867349.89966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867349.90039: Set connection var ansible_timeout to 10 18662 1726867349.90042: Set connection var ansible_connection to ssh 18662 1726867349.90044: Set connection var ansible_shell_executable to /bin/sh 18662 1726867349.90047: Set connection var ansible_shell_type to sh 18662 1726867349.90056: Set connection var ansible_module_compression to ZIP_DEFLATED 18662 1726867349.90061: Set connection var ansible_pipelining to False 18662 1726867349.90087: variable 'ansible_shell_executable' from source: unknown 18662 1726867349.90091: variable 'ansible_connection' from source: unknown 18662 1726867349.90094: variable 'ansible_module_compression' from source: unknown 18662 1726867349.90106: variable 'ansible_shell_type' from source: unknown 18662 1726867349.90112: variable 'ansible_shell_executable' from source: unknown 18662 1726867349.90114: variable 'ansible_host' from source: host vars for 'managed_node2' 18662 1726867349.90116: variable 'ansible_pipelining' from source: unknown 18662 1726867349.90118: variable 'ansible_timeout' from source: unknown 18662 1726867349.90121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 18662 1726867349.90226: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867349.90234: variable 'omit' from source: magic vars 18662 1726867349.90239: starting attempt loop 18662 1726867349.90242: running the handler 18662 1726867349.90251: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 18662 1726867349.90266: _low_level_execute_command(): starting 18662 1726867349.90273: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 18662 1726867349.90763: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867349.90767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867349.90771: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867349.90825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867349.90830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867349.90881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867349.92547: stdout chunk (state=3): >>>/root <<< 18662 1726867349.92648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867349.92670: stderr chunk (state=3): >>><<< 18662 1726867349.92673: stdout chunk (state=3): >>><<< 18662 1726867349.92694: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867349.92705: _low_level_execute_command(): starting 18662 1726867349.92713: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799 `" && echo ansible-tmp-1726867349.9269345-20769-199757927690799="` echo /root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799 `" ) && sleep 0' 18662 1726867349.93361: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867349.93399: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867349.95346: stdout chunk (state=3): >>>ansible-tmp-1726867349.9269345-20769-199757927690799=/root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799 <<< 18662 1726867349.95452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867349.95479: stderr chunk (state=3): >>><<< 18662 1726867349.95483: stdout chunk (state=3): >>><<< 18662 1726867349.95496: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867349.9269345-20769-199757927690799=/root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867349.95523: variable 'ansible_module_compression' from source: unknown 18662 1726867349.95561: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-186628rjisbxe/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 18662 1726867349.95598: variable 'ansible_facts' from source: unknown 18662 1726867349.95648: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799/AnsiballZ_command.py 18662 1726867349.95741: Sending initial data 18662 1726867349.95745: Sent initial data (156 bytes) 18662 1726867349.96162: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867349.96165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867349.96168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 18662 1726867349.96170: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867349.96172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867349.96216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867349.96235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867349.96270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867349.97870: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 18662 1726867349.97874: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 18662 1726867349.97911: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 18662 1726867349.97957: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-186628rjisbxe/tmp_9_atefn /root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799/AnsiballZ_command.py <<< 18662 1726867349.97960: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799/AnsiballZ_command.py" <<< 18662 1726867349.97989: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-186628rjisbxe/tmp_9_atefn" to remote "/root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799/AnsiballZ_command.py" <<< 18662 1726867349.98514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867349.98543: stderr chunk (state=3): >>><<< 18662 1726867349.98547: stdout chunk (state=3): >>><<< 18662 1726867349.98590: done transferring module to remote 18662 1726867349.98601: _low_level_execute_command(): starting 18662 1726867349.98607: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799/ /root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799/AnsiballZ_command.py && sleep 0' 18662 1726867349.99004: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867349.99007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867349.99009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 18662 1726867349.99012: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867349.99014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867349.99063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867349.99066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867349.99110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867350.00900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867350.00923: stderr chunk (state=3): >>><<< 18662 1726867350.00930: stdout chunk (state=3): >>><<< 18662 1726867350.00938: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867350.00941: _low_level_execute_command(): starting 18662 1726867350.00944: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799/AnsiballZ_command.py && sleep 0' 18662 1726867350.01344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867350.01347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867350.01349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867350.01351: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867350.01353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 18662 1726867350.01355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867350.01404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867350.01409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867350.01450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867350.34941: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org 
mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3963 0 --:--:-- --:--:-- --:--:-- 4013\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3778 0 --:--:-- --:--:-- --:--:-- 3828", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:22:30.169977", "end": "2024-09-20 17:22:30.347878", "delta": "0:00:00.177901", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 18662 1726867350.36614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 18662 1726867350.36640: stderr chunk (state=3): >>><<< 18662 1726867350.36643: stdout chunk (state=3): >>><<< 18662 1726867350.36661: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3963 0 --:--:-- --:--:-- --:--:-- 4013\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3778 0 --:--:-- --:--:-- --:--:-- 3828", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:22:30.169977", "end": "2024-09-20 17:22:30.347878", "delta": "0:00:00.177901", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 18662 1726867350.36698: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 18662 1726867350.36705: _low_level_execute_command(): starting 18662 1726867350.36710: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867349.9269345-20769-199757927690799/ > /dev/null 2>&1 && sleep 0' 18662 1726867350.37140: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 18662 1726867350.37178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 18662 1726867350.37184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 18662 1726867350.37186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867350.37188: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 18662 1726867350.37190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 18662 1726867350.37192: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 18662 1726867350.37239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 18662 1726867350.37245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 18662 1726867350.37247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 18662 1726867350.37292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 18662 1726867350.39129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 18662 1726867350.39151: stderr chunk (state=3): >>><<< 18662 1726867350.39155: stdout chunk (state=3): >>><<< 18662 1726867350.39168: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 18662 1726867350.39174: handler run complete 18662 1726867350.39193: Evaluated conditional (False): False 18662 1726867350.39201: attempt loop complete, returning result 18662 1726867350.39203: _execute() done 18662 1726867350.39206: dumping result to json 18662 1726867350.39213: done dumping result, returning 18662 1726867350.39219: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0affcac9-a3a5-efab-a8ce-00000000057f] 18662 1726867350.39223: sending task result for task 0affcac9-a3a5-efab-a8ce-00000000057f 18662 1726867350.39324: done sending task result for task 0affcac9-a3a5-efab-a8ce-00000000057f 18662 1726867350.39327: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.177901", "end": "2024-09-20 17:22:30.347878", "rc": 0, "start": "2024-09-20 17:22:30.169977" } STDOUT: CHECK DNS AND CONNECTIVITY 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 3963 0 --:--:-- --:--:-- --:--:-- 4013 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 3778 0 --:--:-- --:--:-- --:--:-- 3828 18662 1726867350.39390: no more 
18662 1726867350.39390: no more pending results, returning what we have
18662 1726867350.39393: results queue empty
18662 1726867350.39394: checking for any_errors_fatal
18662 1726867350.39403: done checking for any_errors_fatal
18662 1726867350.39404: checking for max_fail_percentage
18662 1726867350.39405: done checking for max_fail_percentage
18662 1726867350.39406: checking to see if all hosts have failed and the running result is not ok
18662 1726867350.39407: done checking to see if all hosts have failed
18662 1726867350.39413: getting the remaining hosts for this loop
18662 1726867350.39414: done getting the remaining hosts for this loop
18662 1726867350.39418: getting the next task for host managed_node2
18662 1726867350.39426: done getting next task for host managed_node2
18662 1726867350.39428: ^ task is: TASK: meta (flush_handlers)
18662 1726867350.39429: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18662 1726867350.39433: getting variables
18662 1726867350.39434: in VariableManager get_vars()
18662 1726867350.39462: Calling all_inventory to load vars for managed_node2
18662 1726867350.39464: Calling groups_inventory to load vars for managed_node2
18662 1726867350.39467: Calling all_plugins_inventory to load vars for managed_node2
18662 1726867350.39479: Calling all_plugins_play to load vars for managed_node2
18662 1726867350.39482: Calling groups_plugins_inventory to load vars for managed_node2
18662 1726867350.39485: Calling groups_plugins_play to load vars for managed_node2
18662 1726867350.40534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18662 1726867350.42226: done with get_vars()
18662 1726867350.42247: done getting variables
18662 1726867350.42317: in VariableManager get_vars()
18662 1726867350.42326: Calling all_inventory to load vars for managed_node2
18662 1726867350.42328: Calling groups_inventory to load vars for managed_node2
18662 1726867350.42330: Calling all_plugins_inventory to load vars for managed_node2
18662 1726867350.42335: Calling all_plugins_play to load vars for managed_node2
18662 1726867350.42337: Calling groups_plugins_inventory to load vars for managed_node2
18662 1726867350.42340: Calling groups_plugins_play to load vars for managed_node2
18662 1726867350.43495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18662 1726867350.45122: done with get_vars()
18662 1726867350.45147: done queuing things up, now waiting for results queue to drain
18662 1726867350.45149: results queue empty
18662 1726867350.45150: checking for any_errors_fatal
18662 1726867350.45153: done checking for any_errors_fatal
18662 1726867350.45153: checking for max_fail_percentage
18662 1726867350.45155: done checking for max_fail_percentage
18662 1726867350.45155: checking to see if all hosts have failed and the running result is not ok
18662 1726867350.45156: done checking to see if all hosts have failed
18662 1726867350.45157: getting the remaining hosts for this loop
18662 1726867350.45158: done getting the remaining hosts for this loop
18662 1726867350.45160: getting the next task for host managed_node2
18662 1726867350.45164: done getting next task for host managed_node2
18662 1726867350.45165: ^ task is: TASK: meta (flush_handlers)
18662 1726867350.45166: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
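The meta (flush_handlers) entries here are not tasks written in the test playbook: ansible-core inserts an implicit handler flush at the end of each section of a play, and the strategy still steps through those points even when no handlers have been notified, which is what the bookkeeping above reflects. An explicit equivalent, for a play that wants pending handlers to run mid-play, is a one-line meta task like this sketch (hypothetical, not taken from this test suite):

    # Hypothetical example: run any handlers notified so far instead of waiting
    # for the implicit flush at the end of the tasks section.
    - name: Flush handlers before continuing
      ansible.builtin.meta: flush_handlers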
18662 1726867350.45169: getting variables
18662 1726867350.45169: in VariableManager get_vars()
18662 1726867350.45179: Calling all_inventory to load vars for managed_node2
18662 1726867350.45181: Calling groups_inventory to load vars for managed_node2
18662 1726867350.45183: Calling all_plugins_inventory to load vars for managed_node2
18662 1726867350.45188: Calling all_plugins_play to load vars for managed_node2
18662 1726867350.45191: Calling groups_plugins_inventory to load vars for managed_node2
18662 1726867350.45193: Calling groups_plugins_play to load vars for managed_node2
18662 1726867350.46415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18662 1726867350.48008: done with get_vars()
18662 1726867350.48028: done getting variables
18662 1726867350.48073: in VariableManager get_vars()
18662 1726867350.48083: Calling all_inventory to load vars for managed_node2
18662 1726867350.48085: Calling groups_inventory to load vars for managed_node2
18662 1726867350.48087: Calling all_plugins_inventory to load vars for managed_node2
18662 1726867350.48091: Calling all_plugins_play to load vars for managed_node2
18662 1726867350.48093: Calling groups_plugins_inventory to load vars for managed_node2
18662 1726867350.48096: Calling groups_plugins_play to load vars for managed_node2
18662 1726867350.49256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
18662 1726867350.50806: done with get_vars()
18662 1726867350.50834: done queuing things up, now waiting for results queue to drain
18662 1726867350.50836: results queue empty
18662 1726867350.50837: checking for any_errors_fatal
18662 1726867350.50838: done checking for any_errors_fatal
18662 1726867350.50839: checking for max_fail_percentage
18662 1726867350.50840: done checking for max_fail_percentage
18662 1726867350.50841: checking to see if all hosts have failed and the running result is not ok
18662 1726867350.50841: done checking to see if all hosts have failed
18662 1726867350.50842: getting the remaining hosts for this loop
18662 1726867350.50843: done getting the remaining hosts for this loop
18662 1726867350.50846: getting the next task for host managed_node2
18662 1726867350.50849: done getting next task for host managed_node2
18662 1726867350.50850: ^ task is: None
18662 1726867350.50851: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
18662 1726867350.50852: done queuing things up, now waiting for results queue to drain
18662 1726867350.50853: results queue empty
18662 1726867350.50855: checking for any_errors_fatal
18662 1726867350.50856: done checking for any_errors_fatal
18662 1726867350.50856: checking for max_fail_percentage
18662 1726867350.50857: done checking for max_fail_percentage
18662 1726867350.50858: checking to see if all hosts have failed and the running result is not ok
18662 1726867350.50858: done checking to see if all hosts have failed
18662 1726867350.50859: getting the next task for host managed_node2
18662 1726867350.50862: done getting next task for host managed_node2
18662 1726867350.50862: ^ task is: None
18662 1726867350.50864: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2              : ok=83   changed=3    unreachable=0    failed=0    skipped=73   rescued=0    ignored=1

Friday 20 September 2024  17:22:30 -0400 (0:00:00.629)       0:00:45.144 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.06s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.06s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.94s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.51s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
Gathering Facts --------------------------------------------------------- 1.21s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Check which packages are installed --- 1.20s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.17s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 1.16s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50
Create veth interface lsr27 --------------------------------------------- 1.14s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Gathering Facts --------------------------------------------------------- 1.13s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.12s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3
Gathering Facts --------------------------------------------------------- 1.08s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.03s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.01s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68
fedora.linux_system_roles.network : Check which packages are installed --- 0.99s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.92s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.83s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
18662 1726867350.50971: RUNNING CLEANUP